Apple Intelligence has had a rocky rollout, but Cupertino’s AI plans are more ambitious than a revamped Siri. As Bloomberg reports, Apple is planning to put cameras in its wearables by 2027, so you can talk to Siri about what you see, starting with AirPods and the Apple Watch.
This Visual Intelligence effort started with the iPhone 16’s Camera Control button, which opens the Camera app and offers an AI-powered explanation of what it sees. Tell it to translate a sign or ask ChatGPT about a landmark in the park, for example.
Apple’s plans for Visual Intelligence go “far beyond the iPhone,” according to Bloomberg’s Mark Gurman, and will eventually be “at the core of future devices.” It’s all about a future where interacting with AI is more hands-free and integrated naturally into your life. No need to look at a screen, type to a chatbot, or upload a photo.
Apple Vision Pro (Credit: Joseph Maldonado)
It’s a big gamble, and recent events suggest Apple is struggling to pull it off. While the Vision Pro has top-notch tech, it’s expensive and uncomfortable to wear for long stretches. Last we heard, Apple was exploring a more affordable Vision Pro to compete with Meta’s VR lineup, though Apple Intelligence is coming to the headset next month.
The company’s lofty ambitions to revamp Siri into a more humanlike, knowledgeable assistant have been delayed multiple times; it too might not arrive until 2027. Apple Intelligence’s lackluster debut prompted a lawsuit claiming false advertising, and Tim Cook has shuffled the execs leading Apple’s AI team. Vision Pro creator Mike Rockwell is now in charge of Siri; can he create a more consumer-friendly version of the Vision Pro with a new Siri baked in?
Meta Ray-Ban glasses (Credit: Andrew Gebhart)
While the idea of earbuds with cameras is innovative, Apple is playing catch-up on camera-based intelligence. Meta has already found success with its Ray-Ban glasses and has lofty ambitions to evolve them. Google is also further along: it has offered Lens for years, is moving ahead with Project Astra, and recently released screen-reading for its Gemini AI.
Amazon, meanwhile, previewed Alexa+ last month, which promises more natural conversations and the ability to discuss what it sees on camera. Like Siri, Alexa+ was delayed several times before its announcement. It’s not easy to create fluid, hands-free assistants with visual capabilities. Will they be worth the wait?