What it’s like to wear Amazon’s new smart glasses for delivery drivers

By News Room | Published October 26, 2025
GeekWire’s Todd Bishop tries Amazon’s new smart delivery glasses in a simulated demo.

SAN FRANCISCO — Putting on Amazon’s new smart delivery glasses felt surprisingly natural from the start. Despite their high-tech components and slightly bulky design, they were immediately comfortable and barely heavier than my normal glasses.

Then a few lines of monochrome green text and a square target popped up in the right-hand lens — reminding me that these were not my regular frames. 

Occupying just a portion of my total field of view, the text showed an address and a sorting code: “YLO 339.” As I learned, “YLO” represented the yellow tote bag where the package would normally be found, and “339” was a special code on the package label.

My task: find the package with that code. Or more precisely, let the glasses find it.

Amazon image from a separate demo, showing the process of scanning packages with the new glasses.

As soon as I looked at the correct package label, the glasses recognized the code and scanned the label automatically. A checkmark appeared on a list of packages in the glasses.
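
To make the flow concrete, here is a minimal sketch of the sort-code parsing and label-matching step described above. The prefix-to-color mapping, data structures, and function names are assumptions for illustration; Amazon's actual software is not public.

    # Hypothetical sketch of the scan-and-match step; not Amazon's code.
    TOTE_COLORS = {"YLO": "yellow", "RED": "red", "BLU": "blue"}  # assumed mapping

    def parse_sort_code(sort_code: str) -> tuple[str, str]:
        """Split a code like 'YLO 339' into a tote color and a label code."""
        prefix, label_code = sort_code.split()
        return TOTE_COLORS.get(prefix, "unknown"), label_code

    def on_label_scanned(scanned_code: str, manifest: dict[str, bool]) -> bool:
        """Mark a package as found when its label code appears in the manifest."""
        if scanned_code in manifest and not manifest[scanned_code]:
            manifest[scanned_code] = True  # drives the checkmark in the display
            return True
        return False

    manifest = {"339": False, "341": False}
    tote, code = parse_sort_code("YLO 339")  # -> ("yellow", "339")
    on_label_scanned("339", manifest)        # -> True: package 339 checked off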

Then an audio alert played from the glasses: “Dog on property.”

When all the packages were scanned, the tiny green display immediately switched to wayfinding mode. A simple map appeared, showing my location as a dot, and the delivery destination marked with pins. In this simulation, there were two pins, indicating two stops at this location. 

After navigating to the doorstep, it was time for the final step: proof of delivery. Instead of reaching for a phone, I looked at the package on the doorstep and pressed a button once on the small controller unit — the “compute puck” — on my harness. The glasses captured a photo.

With that, my simulated delivery was done, without ever touching a handheld device.

In my very limited experience, the biggest concern I had was the potential to be distracted — focusing my attention on the text in front of my eyes rather than the world around me. I understand now why the display automatically turns off when a van is in motion. 

But when I mentioned that concern to the Amazon leaders guiding me through the demo, they pointed out that the alternative is looking down at a device. With the glasses, your gaze is up and largely unobstructed, theoretically making it much easier to notice possible hazards. 

Beyond the fact that they’re not intended for public release, the key difference between Amazon’s utilitarian design and other augmented-reality devices — such as Meta’s Ray-Ban glasses, Apple Vision Pro, and Magic Leap — is simplicity: those products aim to more fully enhance or overlay the user’s environment.

One driver’s experience

KC Pangan, who delivers Amazon packages in San Francisco and was featured in Amazon’s demo video, said wearing the glasses has become so natural that he barely notices them. 

Pangan has been part of a company study for the past two months. On the rare occasions when he switches back to the old handheld device, he finds himself thinking, “Oh, this thing again.”

“The best thing about them is being hands-free,” Pangan said in a conversation on the sidelines of the Amazon Delivering the Future event, where the glasses were unveiled last week.

Without needing to look down at a handheld device, he can keep his eyes up and stay alert for potential hazards. With an extra hand free, he can maintain the all-important three points of contact when climbing in or out of a vehicle, and more easily carry packages and open gates.

The glasses, he said, “do practically everything for me” — taking photos, helping him know where to walk, and showing his location relative to his van. 

While Amazon emphasizes safety and driver experience as the primary goals, early tests hint at efficiency gains, as well. In initial tests, Amazon has seen up to 30 minutes of time savings per shift, although execs cautioned that the results are preliminary and could change with wider testing.

KC Pangan, an Amazon delivery driver in San Francisco who has been part of a pilot program for the new glasses. (GeekWire Photo / Todd Bishop)

Using the glasses will be fully optional for both its Delivery Service Partners (DSPs) and their drivers, even when it’s fully rolled out, according to the company. The system also includes privacy features, such as a hardware button that allows drivers to turn off all sensors.

For those who use them, the company says it plans to provide the devices at no cost. 

Despite the way it may look to the public, Amazon doesn’t directly employ the drivers who deliver its packages in Amazon-branded vans and uniforms. Instead, it contracts with DSPs, ostensibly independent companies that hire drivers and manage package deliveries from inside Amazon facilities. 

This arrangement has periodically sparked friction, and even lawsuits, as questions have come up over DSP autonomy and accountability.

With the introduction of smart glasses and other tech initiatives, including a soon-to-be-expanded training program, Amazon is deepening its involvement with DSPs — potentially raising more questions about who truly controls the delivery workforce.

Regulators, legislators and employees have raised red flags over new technology pushing Amazon fulfillment workers to the limits of human capacity and safety. Amazon disputes this premise, and calls the new glasses part of a larger effort to use technology to improve safety.

From ‘moonshot’ to reality

The smart glasses, still in their prototype phase, trace their origins to a brainstorming session about five years ago, said Beryl Tomay, Amazon’s vice president of transportation.

Each year, the team brainstorms big ideas for the company’s delivery system — and during one of those sessions, a question emerged: What if drivers didn’t have to interact with any technology at all?  

“The moonshot idea we came up with was, what if there was no technology that the driver had to interact with — and they could just follow the physical process of delivering a package from the van to the doorstep?” Tomay said in an interview. “How do we make that happen so they don’t have to use a phone or any kind of tech that they have to fiddle with?”

Beryl Tomay, Amazon’s vice president of transportation, introduces the smart glasses at Amazon’s Delivering the Future event. (GeekWire Photo / Todd Bishop)

That question led the team to experiment with different approaches before settling on glasses. It seemed kind of crazy at first, Tomay said, but they soon realized the potential to improve safety and the driver experience. Early trials with delivery drivers confirmed the theory.

“The hands-free aspect of it was just kind of magical,” she said, summing up the reaction from early users.

The project has already been tested with hundreds of delivery drivers across more than a dozen DSPs. Amazon plans to expand those trials in the coming months, with a larger test scheduled for November. The goal is to collect more feedback before deciding when the technology will be ready for wider deployment.

Typically, Amazon would have kept a new hardware project secret until later in its development, but Reuters reported on the project’s existence nearly a year ago. (The glasses were reportedly code-named “Amelia,” but they were announced without a name.) Going public now also lets Amazon get more delivery partners involved, gather input, and make improvements.

Future versions may also expand the system’s capabilities, using sensors and data to automatically recognize potential hazards such as uneven walkways.

How the technology works

Amazon’s smart glasses are part of a larger system that also includes a small wearable computer and a battery, integrated with Amazon’s delivery software and vehicle systems.

The lenses are photochromic, darkening automatically in bright sunlight, and can be fitted with prescription inserts. Two cameras — one centered, one on the left — support functions such as package scanning and photo capture for proof of delivery.

A built-in flashlight switches on automatically in dim conditions, while onboard sensors help the system orient to the driver’s movement and surroundings.

Amazon executive Viraj Chatterjee and driver KC Pangan demonstrate the smart glasses.

The glasses connect by a magnetic wire to a small controller unit, or “compute puck,” worn on the chest of a heat-resistant harness. The controller houses the device’s AI models, manages the visual display, and handles functions such as taking a delivery photo. It also includes a dedicated emergency button that connects drivers directly to Amazon’s emergency support systems.
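
The two buttons described above suggest a simple dispatch model. A minimal sketch with hypothetical names; the controller's real interface is not public.

    from enum import Enum, auto

    class PuckButton(Enum):
        CAPTURE = auto()    # single press: proof-of-delivery photo
        EMERGENCY = auto()  # dedicated button: reach Amazon emergency support

    def handle_press(button: PuckButton) -> str:
        """Map a button press to an action; names are illustrative only."""
        if button is PuckButton.CAPTURE:
            return "capture_delivery_photo"
        if button is PuckButton.EMERGENCY:
            return "open_emergency_channel"
        raise ValueError(f"unknown button: {button}")

    handle_press(PuckButton.CAPTURE)  # -> "capture_delivery_photo"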

On the opposite side of the chest, a swappable battery keeps the system balanced and running for a full route. Both components are designed for all-day comfort — the result, Tomay said, of extensive testing with drivers to ensure that wearing the gear feels natural as they move around.

Connectivity runs through the driver’s official Amazon delivery phone via Bluetooth, and through the vehicle itself using a platform called “Fleet Edge” — a network of sensors and onboard computing modules that link the van’s status to the glasses. 

This connection allows the glasses to know precisely when to activate, when to shut down, and when to sync data. When a van is put in park, the display automatically activates, showing details such as addresses, navigation cues, and package information. When the vehicle starts moving again, the display turns off — a deliberate safety measure so drivers never see visual data while driving.
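
That rule amounts to a simple gate on vehicle state. A minimal sketch, assuming the van reports a park/drive signal as described; the event shape and names are hypothetical, since Fleet Edge's interfaces are not public.

    class GlassesDisplay:
        """Display gate keyed to vehicle state; illustrative only."""

        def __init__(self) -> None:
            self.active = False

        def on_vehicle_state(self, gear: str) -> None:
            if gear == "park":
                self.active = True   # show address, navigation cues, packages
            else:
                self.active = False  # never show visual data while driving

    display = GlassesDisplay()
    display.on_vehicle_state("park")   # display.active -> True
    display.on_vehicle_state("drive")  # display.active -> False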

Data gathered by the glasses plays a role in Amazon’s broader mapping efforts. Imagery and sensor data feed into “Project Wellspring,” a system that uses AI to better model the physical world. This helps Amazon refine maps, identify the safest parking spots, pinpoint building entrances, and optimize walking routes for future deliveries.

Amazon says the data collection is done with privacy in mind. In addition to the driver-controlled sensor shut-off button, any imagery collected is processed to “blur or remove personally identifiable information” such as faces and license plates before being stored or used.
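
Amazon hasn't described its redaction pipeline, but the face-blurring step can be illustrated with off-the-shelf tools. A minimal sketch using OpenCV's bundled Haar cascade face detector; this stands in for whatever models Amazon actually uses, and license plates would be handled analogously.

    import cv2

    def redact_faces(image):
        """Blur detected faces before an image is stored (illustrative only)."""
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
            roi = image[y:y + h, x:x + w]
            image[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
        return image

    # Usage: img = cv2.imread("delivery.jpg"); redact_faces(img)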

The implications go beyond routing and navigation. Conceivably, the same data could also lay the groundwork for greater automation in Amazon’s delivery network over time.

Testing the delivery training

In addition to trying the glasses during the event at Amazon’s Delivery Station in Milpitas, Calif., I experienced firsthand just how difficult the job of delivering packages can be. 

GeekWire’s Todd Bishop uses an Amazon training program that teaches drivers to walk safely on slippery surfaces.
  • Strapped into a harness for a slip-and-fall demo, I learned how easily a driver can lose footing on slick surfaces without proper walking technique.
  • I tried a VR training device that highlighted hidden hazards like pets sleeping under tires and taught me how to navigate complex intersections safely.
  • My turn in the company’s Rivian van simulator proved humbling. Despite my best efforts, I ran red lights and managed to crash onto virtual sidewalks.
GeekWire’s Todd Bishop after a highly unsuccessful attempt to use Amazon’s driving simulator.

The simulator, known as the Enhanced Vehicle Operation Learning Virtual Experience (EVOLVE), has been launched at Amazon facilities in Colorado, Maryland, and Florida, and Amazon says it will be available at 40 sites by the end of 2026. 

It’s part of the Integrated Last Mile Driver Academy (iLMDA), a program currently available at 65 sites, which Amazon says it plans to expand to more than 95 delivery stations across North America by the end of 2026.

“Drivers are autonomous on the road, and the amount of variables that they interact with on a given day are countless,” said Anthony Mason, Amazon’s director of delivery training and programs, who walked me through the training demos. One goal of the training, he said, is to give drivers a toolkit to pull from when they face challenging situations.

Suffice it to say, this is not the job for me. But if Amazon’s smart glasses live up to its expectations, they might be a step forward for the drivers doing the real work.
