In a move to position the Apple Watch as more of an AI wearable, Apple is working on multiple future Apple Watch models that include cameras. This would help the device “see the outside world,” per Bloomberg’s Mark Gurman.
This AI push would go hand in hand with Apple’s existing Visual Intelligence technology, which the company also intends to bring to AirPods. Right now, Visual Intelligence heavily relies on ChatGPT and Google. According to Gurman, though, the company wants to bring this work in-house:
Apple’s ultimate plan for Visual Intelligence goes far beyond the iPhone. The company wants to put the feature at the core of future devices, including the camera-equipped AirPods that I’ve been writing about for several months. Along the way, Apple also wants to shift Visual Intelligence toward its own AI models, rather than those from OpenAI and Google.
As for Apple Watch models with cameras, Gurman reports that Apple is working on them for both the standard and Ultra models. On the standard Apple Watch, the camera would be embedded within the display, similar to an iPhone. It’s unclear whether this would use under-display tech or require a camera cutout.
With the Ultra, the company plans on embedding the camera in the side of the watch, next to the Digital Crown and side button, likely because there’s more space to work with there. Gurman says that Ultra users would have an easier time pointing their wrist at things to scan.
This Apple Watch wouldn’t launch until at least 2027, according to Gurman, alongside the rumored AirPods with cameras. This all hinges on Apple’s AI teams, which recently underwent an executive shake-up, getting things in order.