Snap might be best known for the popular Snapchat app, but after previewing the company’s new Snap OS 2.0, due to ship on its consumer-focused AR glasses next year, that could soon change.
I recently went down to Snap’s London HQ to get a taste of what to expect – in terms of software – from the company’s upcoming AR glasses, and I was pleasantly surprised, to say the least.
Unlike the fairly basic AR experiences we’ve seen so far, the minds behind Snapchat have created what I can only describe as the most comprehensive AR experience I’ve seen yet. And, with the consumer glasses confirmed for launch in 2026, we won’t have long to wait to experience it for ourselves either.
Could Snap be poised to rebrand as a leader in the smart glasses market? After spending time with the latest software, I’m quietly confident.
AR software is pretty basic right now
I’ve tried a few pairs of ‘true’ smart glasses with embedded screens this year, including the popular Even Realities G1 and, more recently, the Rokid AR Glasses. However, there’s a common theme: the software is relatively basic compared to what we’re accustomed to from phones and even smartwatches.
Sure, the Even Realities G1 provides access to notifications, stocks, map directions and Even AI, the company’s proprietary chatbot powered by ChatGPT, but the experience is static. The information is displayed in a fixed position on the specs, accessible when glancing up, and acts more like an overlay than a true augmented reality experience that blurs the lines between what’s real and what isn’t.

It’s a similar story with the Rokid Glasses. They offer similar functionality, with the addition of real-time translation powered by beamforming microphones that managed to work on a busy show floor without much issue, and their navigation app is powered by the Google Maps API – but like the G1, the experience is very much static.
It’s not just the static nature of AR glasses software though; most screens only display a single colour, usually green. It’s very Matrix-esque, but again, it’s limited. You couldn’t exactly watch a movie with these specs.
Some glasses focus on high-quality optics, like the Xreal Air 2 Pro, but these aren’t glasses that you can wear day in, day out. I’d compare them more to a VR headset than true smart glasses, designed for watching TV and connecting to your PC for work, but not general use.
Don’t get me wrong, these smart specs are a solid first step for consumer-friendly AR glasses, especially in terms of hardware and design, but the software is still too basic for most to justify the high-end price tags.
And that’s what makes Snap’s new Snap OS 2.0 so exciting.
Snap OS 2.0 offers the sci-fi AR experience we were promised
Snap’s software is so far ahead of the competition that it feels like it’s truly from the future.
But then again, I shouldn’t be that surprised; the company was one of the first to enter the smart glasses space, releasing the original Snapchat Spectacles back in 2016 and pivoting to AR pretty soon after. That means it has had quite some time to perfect the software for its long-awaited consumer-focused AR glasses, and it really shows.
At Snap’s London HQ, I donned the fifth-gen Spectacles that launched for developers last year, and was immediately impressed by just how advanced the new Snap OS 2.0 update is.
For one, it offers the true mixed-reality experience promised by early-2000s sci-fi movies, with full-colour graphics that integrated naturally with the world around me. What’s more, these virtual elements were anchored to the real world, meaning I could look or even walk away, then turn back and find them exactly where I’d left them.
There are no tap or swipe gestures on the developer glasses; instead, the software relies on hand tracking. I only had to look down at the palm of my hand and use my other hand to tap the various buttons that appeared. It not only feels futuristic, but the fact that I was tapping on my hand rather than at an invisible button in mid-air made it feel more ‘real’.
It also meant that I could move or resize apps simply by grabbing them, stretching them and dragging them into my preferred position. No menus, no buttons, nothing fiddly; just a refreshingly physical experience.
That core experience has been bolstered by new additions in Snap OS 2.0, including a redesigned browser that lets you browse the web and watch content while, say, washing the dishes, or pull up a YouTube tutorial while fixing a bike. It also supports WebXR, potentially opening the door to AR experiences that run in the browser without the need to download an app.
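For context, WebXR is the existing web standard that lets a page request an augmented-reality session directly from the browser. Snap hasn’t detailed exactly how its browser exposes this, so purely as a hypothetical sketch of the standard API (the function name and feature list here are my own illustration), starting an app-free AR experience looks something like this:

```ts
// A minimal, hypothetical sketch of starting an AR session via the
// standard WebXR API – not Snap's confirmed implementation.
// Assumes the @types/webxr package for navigator.xr type definitions.
async function startArSession(): Promise<void> {
  // Feature-detect WebXR and the immersive AR session mode.
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-ar"))) {
    console.log("Immersive AR isn't supported in this browser.");
    return;
  }

  // Request the session (browsers require this to follow a user gesture,
  // such as a button tap); 'hit-test' lets content anchor to real surfaces.
  const session = await navigator.xr.requestSession("immersive-ar", {
    requiredFeatures: ["hit-test"],
  });

  // From here, a render loop draws virtual content over the user's view.
  session.requestAnimationFrame((time, frame) => {
    // ...render anchored AR content for this frame...
  });
}
```

In theory, any WebXR-capable site could serve an AR experience straight to the glasses’ browser in this way, no Lens download required.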
There’s also, rather unsurprisingly, tight integration with the core Snapchat app and functionality.
The new Spotlight Lens provides access to Snapchat’s Spotlight feed, allowing you to watch content from Snap creators without reaching for your phone, while the updated Gallery Lens lets you view your Spectacles captures on a far larger display than your smartphone offers.
In fact, there are already plenty of apps, or ‘Lenses’, available for the specs, both from Snap itself and from developers already creating content for the platform – but more on the latter a little later.
And, using the built-in cameras, you can, of course, capture content directly from your glasses for posting on Snapchat.
That’s already a pretty fantastic AR experience compared to what’s currently available – but it’s Snap’s use of GenAI that really impressed me.
Probably the best use of GenAI yet
If you’re anything like me, you’ve probably got a touch of AI fatigue by this point. GenAI is being thrust into tech of all shapes and sizes, from smartphones and smartwatches to smart home appliances – and if we’re being honest, it’s not always that helpful.
That’s certainly not the case with Snap’s AI integration, however. In fact, I’d go as far as to say it’s probably the best, most intuitive use of AI yet.
Snap calls it Spatial Tips, and it essentially allows the company’s AI to not only see what you see via the embedded cameras, but actually integrate its responses into the real world.
To get a flavour of what it could do, I simply asked it to label what it could see, and it did so – faster and more accurately than I expected. It even marked spare pairs of Snap’s developer specs as augmented reality glasses, as well as labelling laptops, chairs, tables and anything else around at the time.
However, it was when I started asking more specific questions that things truly got exciting. I looked at a skateboard and asked the AI how to do an ollie. Rather than reeling off a long spoken response that does little to actually guide you, it overlaid the steps onto the skateboard itself, showing me where to place my feet at each stage to perform an ollie correctly.
Imagine being able to look at an IKEA flat-pack and receive step-by-step instructions on how to assemble it, or being guided through fixing something under the hood of your car. It’s so much more intuitive than what’s provided by Gemini or ChatGPT, and completely ditches the need for instruction manuals and the confusing diagrams that accompany them.
This is the killer use of AI – it’s concise and clear, and with visual aids guiding you, it could genuinely make a difference in your day-to-day life.
AI also powers the Super Travel Lens, which translates signs, menus and more – you simply outline what you want translated with your finger. Currently, it provides a translated image, but Snap hopes to overlay translations directly onto the real-world text by launch.
There are also fantastic live speech translation capabilities, with support for up to 40 languages in real time, complete with closed captions beneath each speaker that ‘stick’ to them as they move around. It could also be a game-changer for the hard of hearing, effectively serving as a universal closed-caption device.
Coming in 2026, and devs are already creating apps
The most exciting thing about all this is that Snap’s consumer-focused AR glasses are set for release sometime in 2026 – so we’re not that far from experiencing it ourselves. Don’t be put off by the large, chunky Spectacles I wore during my experience either – Snap has confirmed that its consumer specs will be much more compact.
What’s more, with development hardware in the hands of developers around the world, there’s already a growing list of apps and experiences designed for the platform. That means there should be an abundance of things to do with the glasses from day one, with no awkward phase of limited app availability – I’m looking at you, Apple Vision Pro.
At a time when we’re seeing either expensive AR prototypes or smart specs with rather limited AR capabilities, Snap’s AR glasses could truly stand out – and I can’t wait to try them out for myself in 2026.