Some models of Apple’s popular AirPods may soon be able to do live, in-person language translations when you squeeze both stems at the same time.
According to an image posted by websites including 9to5Mac, the touch gesture is featured in a system asset that’s part of Apple iOS 26 developer beta 6. In the image, the gesture is shown on a pair of AirPods with words in languages including English, Spanish, German, French and Portuguese. In June, Apple showed off AI-powered live translation features it plans to roll out for apps including Messages, FaceTime and Phone.
But the company did not specifically mention live translation on the AirPods, even though reports from as early as March suggested it was in the works. According to 9to5Mac, the feature is likely to work with the AirPods Pro 2 and fourth-gen AirPods. It's unclear whether live translation would be available across Apple devices or be exclusive to new product models, such as the company's upcoming iPhone 17 lineup.
Apple didn't immediately respond to a request for comment.
What it means for Apple users
One scenario, if Apple chooses to make live translation available sooner rather than later, would be to position it as a marquee Apple Intelligence feature of the iPhone 17, which is expected to debut in September. If, as some have speculated, the translation work is done on the phone rather than on the AirPods themselves, Apple could feasibly release a software update at that point to bring translation to some AirPods models.
The selling point might be Apple touting the iPhone 17 as its first device with the AI features and hardware capable of such a magical feat.
However, that might also open the company up to criticism: Samsung phones have been able to live-translate phone calls since 2024's Galaxy S24. And they're not the only ones. Meta's Ray-Ban smart glasses also offer live translation, as do other gadgets such as Google's Pixel Buds, dating back to 2017.