SAN FRANCISCO — Putting on Amazon’s new smart delivery glasses felt surprisingly natural from the start. Despite their high-tech components and slightly bulky design, they were immediately comfortable and barely heavier than my normal glasses.
Then a few lines of monochrome green text and a square target popped up in the right-hand lens — reminding me that these were not my regular frames.
Occupying just a portion of my total field of view, the text showed an address and a sorting code: “YLO 339.” As I learned, “YLO” represented the yellow tote bag where the package would normally be found, and “339” was a special code on the package label.
My task: find the package with that code. Or more precisely, let the glasses find it for me.

As soon as I looked at the correct package label, the glasses recognized the code and scanned the label automatically. A checkmark appeared next to it on the list of packages shown in the display.
Then an audio alert played from the glasses: “Dog on property.”
When all the packages were scanned, the tiny green display immediately switched to wayfinding mode. A simple map appeared, showing my location as a dot and the delivery destinations marked with pins. In this simulation, there were two pins, indicating two stops at this location.
After navigating to the doorstep, it was time for the final step: proof of delivery. Instead of reaching for a phone, I looked at the package on the doorstep and pressed a button once on the small controller unit, the “compute puck,” on my harness. The glasses captured a photo.
With that, my simulated delivery was done, without ever touching a handheld device.
In my very limited experience, my biggest concern was the potential for distraction: focusing my attention on the text in front of my eyes rather than the world around me. I understand now why the display automatically turns off when a van is in motion.
But when I mentioned that concern to the Amazon leaders guiding me through the demo, they pointed out that the alternative is looking down at a device. With the glasses, your gaze is up and largely unobstructed, theoretically making it much easier to notice possible hazards.
Beyond the fact that they’re not intended for public release, the glasses’ simplicity is a key difference between Amazon’s utilitarian design and other augmented reality devices, such as Meta Ray-Bans, Apple Vision Pro, and Magic Leap, which aim to more fully enhance or overlay the user’s environment.
One driver’s experience
KC Pangan, who delivers Amazon packages in San Francisco and was featured in Amazon’s demo video, said wearing the glasses has become so natural that he barely notices them.
Pangan has been part of a company study for the past two months. On the rare occasions when he switches back to the old handheld device, he finds himself thinking, “Oh, this thing again.”
“The best thing about them is being hands-free,” Pangan said in a conversation on the sidelines of the Amazon Delivering the Future event, where the glasses were unveiled last week.
Without needing to look down at a handheld device, he can keep his eyes up and stay alert for potential hazards. With a hand freed up, he can maintain the all-important three points of contact when climbing in or out of a vehicle, and more easily carry packages and open gates.
The glasses, he said, “do practically everything for me” — taking photos, helping him know where to walk, and showing his location relative to his van.
While Amazon emphasizes safety and driver experience as the primary goals, early tests hint at efficiency gains as well: up to 30 minutes of time savings per shift, although executives cautioned that the results are preliminary and could change with wider testing.

Using the glasses will be fully optional for both Amazon’s Delivery Service Partners (DSPs) and their drivers, even when the technology is fully rolled out, according to the company. The system also includes privacy features, such as a hardware button that allows drivers to turn off all sensors.
For those who use them, the company says it plans to provide the devices at no cost.
Despite the way it may look to the public, Amazon doesn’t directly employ the drivers who deliver its packages in Amazon-branded vans and uniforms. Instead, it contracts with DSPs, ostensibly independent companies that hire drivers and manage package deliveries from inside Amazon facilities.
This arrangement has periodically sparked friction, and even lawsuits, as questions have come up over DSP autonomy and accountability.
With the introduction of smart glasses and other tech initiatives, including a soon-to-be-expanded training program, Amazon is deepening its involvement with DSPs — potentially raising more questions about who truly controls the delivery workforce.
Regulators, legislators and employees have raised red flags over new technology pushing Amazon fulfillment workers to the limits of human capacity and safety. Amazon disputes this premise, and calls the new glasses part of a larger effort to use technology to improve safety.
From ‘moonshot’ to reality
The smart glasses, still in their prototype phase, trace their origins to a brainstorming session about five years ago, said Beryl Tomay, Amazon’s vice president of transportation.
Each year, the team brainstorms big ideas for the company’s delivery system — and during one of those sessions, a question emerged: What if drivers didn’t have to interact with any technology at all?
“The moonshot idea we came up with was, what if there was no technology that the driver had to interact with — and they could just follow the physical process of delivering a package from the van to the doorstep?” Tomay said in an interview. “How do we make that happen so they don’t have to use a phone or any kind of tech that they have to fiddle with?”

That question led the team to experiment with different approaches before settling on glasses. It seemed kind of crazy at first, Tomay said, but they soon realized the potential to improve safety and the driver experience. Early trials with delivery drivers confirmed the theory.
“The hands-free aspect of it was just kind of magical,” she said, summing up the reaction from early users.
The project has already been tested with hundreds of delivery drivers across more than a dozen DSPs. Amazon plans to expand those trials in the coming months, with a larger test scheduled for November. The goal is to collect more feedback before deciding when the technology will be ready for wider deployment.
Typically, Amazon would have kept a new hardware project under wraps until later in its development, but Reuters reported on the existence of the project nearly a year ago. (The glasses were reportedly code-named “Amelia,” but they were announced without a name.) Going public now also lets Amazon get more delivery partners involved, gather input, and make improvements.
Future versions may also expand the system’s capabilities, using sensors and data to automatically recognize potential hazards such as uneven walkways.
How the technology works
Amazon’s smart glasses are part of a larger system that also includes a small wearable computer and a battery, integrated with Amazon’s delivery software and vehicle systems.
The lenses are photochromic, darkening automatically in bright sunlight, and can be fitted with prescription inserts. Two cameras — one centered, one on the left — support functions such as package scanning and photo capture for proof of delivery.
A built-in flashlight switches on automatically in dim conditions, while onboard sensors help the system orient to the driver’s movement and surroundings.

The glasses connect by a magnetic wire to a small controller unit, or “compute puck,” worn on the chest of a heat-resistant harness. The controller houses the device’s AI models, manages the visual display, and handles functions such as taking a delivery photo. It also includes a dedicated emergency button that connects drivers directly to Amazon’s emergency support systems.
On the opposite side of the chest, a swappable battery keeps the system balanced and running for a full route. Both components are designed for all-day comfort — the result, Tomay said, of extensive testing with drivers to ensure that wearing the gear feels natural as they move around.
Connectivity runs through the driver’s official Amazon delivery phone via Bluetooth, and through the vehicle itself using a platform called “Fleet Edge” — a network of sensors and onboard computing modules that link the van’s status to the glasses.
This connection allows the glasses to know precisely when to activate, when to shut down, and when to sync data. When a van is put in park, the display automatically activates, showing details such as addresses, navigation cues, and package information. When the vehicle starts moving again, the display turns off — a deliberate safety measure so drivers never see visual data while driving.
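To picture how that vehicle-linked gating might work, here is a minimal sketch in Python; the class names, event strings, and structure are my own assumptions for illustration, not Amazon’s actual software.

```python
# Hypothetical sketch of the display-gating behavior described above.
# Class and event names are assumptions for illustration, not Amazon's code.
from dataclasses import dataclass


@dataclass
class Stop:
    address: str
    packages: list[str]  # sorting codes, e.g. "YLO 339"


class HeadsUpDisplay:
    def __init__(self) -> None:
        self.active = False

    def on_vehicle_state(self, state: str, stop: Stop | None = None) -> None:
        """React to a vehicle-state signal (e.g., from a Fleet Edge-style feed)."""
        if state == "PARKED" and stop is not None:
            self.active = True
            self._show(stop)
        elif state == "DRIVING":
            # Deliberate safety measure: no visual data while the van is moving.
            self.active = False
            self._clear()

    def _show(self, stop: Stop) -> None:
        print(f"Address: {stop.address}")
        for code in stop.packages:
            print(f"  Package: {code}")

    def _clear(self) -> None:
        print("(display off)")


hud = HeadsUpDisplay()
hud.on_vehicle_state("PARKED", Stop("123 Main St", ["YLO 339"]))
hud.on_vehicle_state("DRIVING")
```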
Data gathered by the glasses plays a role in Amazon’s broader mapping efforts. Imagery and sensor data feed into “Project Wellspring,” a system that uses AI to better model the physical world. This helps Amazon refine maps, identify the safest parking spots, pinpoint building entrances, and optimize walking routes for future deliveries.
Amazon says the data collection is done with privacy in mind. In addition to the driver-controlled sensor shut-off button, any imagery collected is processed to “blur or remove personally identifiable information” such as faces and license plates before being stored or used.
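For a sense of what that kind of post-processing can look like, here is a minimal, hypothetical sketch that blurs detected faces using OpenCV’s stock detector; Amazon has not disclosed its actual pipeline, and a production system would use stronger detection models and also handle license plates.

```python
# Hypothetical sketch of PII blurring with OpenCV's bundled face detector.
# Not Amazon's pipeline; for illustration only.
import cv2


def blur_faces(input_path: str, output_path: str) -> None:
    image = cv2.imread(input_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Haar cascade shipped with the opencv-python package.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        region = image[y:y + h, x:x + w]
        # Heavy Gaussian blur makes the region unrecoverable in the stored image.
        image[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)

    cv2.imwrite(output_path, image)
```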
The implications go beyond routing and navigation. Conceivably, the same data could also lay the groundwork for greater automation in Amazon’s delivery network over time.
Testing the delivery training
In addition to trying the glasses during the event at Amazon’s Delivery Station in Milpitas, Calif., I experienced firsthand just how difficult the job of delivering packages can be.

- Strapped into a harness for a slip-and-fall demo, I learned how easily a driver can lose footing on slick surfaces without careful walking technique.
- I tried a VR training device that highlighted hidden hazards like pets sleeping under tires and taught me how to navigate complex intersections safely.
- My turn in the company’s Rivian van simulator proved humbling. Despite my best efforts, I ran red lights and managed to crash onto virtual sidewalks.

The simulator, known as the Enhanced Vehicle Operation Learning Virtual Experience (EVOLVE), has been launched at Amazon facilities in Colorado, Maryland, and Florida, and Amazon says it will be available at 40 sites by the end of 2026.
It’s part of what’s known as the Integrated Last Mile Driver Academy (iLMDA), a program available at 65 sites currently, which Amazon says it plans to expand to more than 95 delivery stations across North America by the end of 2026.
“Drivers are autonomous on the road, and the amount of variables that they interact with on a given day are countless,” said Anthony Mason, Amazon’s director of delivery training and programs, who walked me through the training demos. One goal of the training, he said, is to give drivers a toolkit to pull from when they face challenging situations.
Suffice it to say, this is not the job for me. But if Amazon’s smart glasses live up to the company’s expectations, they might be a step forward for the drivers doing the real work.
