The long-awaited launch of the new AI-powered Siri now looks much closer thanks to Apple’s partnership with Google. The company this week confirmed reports that many Siri features will be powered by Google’s Gemini models.
We already knew some of the features we could expect from AI Siri thanks to the announcements at WWDC 2024 and a now-deleted iPhone 16 ad – and the launch of Gemini Personal Intelligence has now effectively provided a working preview …
Drawing personal information from your apps
Google yesterday launched a beta version of what it calls Personal Intelligence. The headline feature here is Gemini’s ability to use a complex mix of sources to generate responses, including personalized information pulled from a number of the Google apps and services people use.
Personal Intelligence can retrieve specific details from text, photos, or videos in your Google apps to customize Gemini responses. This includes Google Workspace (Gmail, Calendar, Drive, etc), Google Photos, your YouTube watch history, and all of the various Google search services you’ve used (Search, Shopping, News, Maps, Google Flights, and Hotels).
The Apple version will of course pull information from Apple apps like Mail, Calendar, Photos, and Notes.
Google’s Gemini head Josh Woodward gave an example of how Personal Intelligence had helped him.
We needed new tires for our 2019 Honda minivan two weeks ago. Standing in line at the shop, I realized I didn’t know the tire size. I asked Gemini.
These days any chatbot can find these tire specs, but Gemini went further. It suggested different options: one for daily driving and another for all-weather conditions, referencing our family road trips to Oklahoma found in Google Photos. It then neatly pulled ratings and prices for each.
As I got to the counter, I needed our license plate. Instead of searching for it or losing my spot in line to walk back to the parking lot, I asked Gemini. It pulled the seven-digit number from a picture in Photos and also helped me identify the van’s specific trim by searching Gmail. Just like that, we were set.
Addressing the hallucination problem
Hallucinations are one of the greatest dangers with AI systems, and the risks obviously increase when a system is extrapolating from your own personal data. Google says the new feature lets you see exactly what assumptions it is making, and offers you the chance to verify or correct them.
You also won’t have to guess where an answer comes from: Gemini will try to reference or explain the information it used from your connected sources so you can verify it. If it doesn’t, you can ask it for more information. And if a response feels off, just correct it on the spot (“Remember, I prefer window seats”).
You’ll also have the option of telling it to give you a new answer without personalization.
Privacy protection
Google says that you have to opt in to Personal Intelligence and that you decide which apps to include.
Connecting your apps is off by default: you choose to turn it on, decide exactly which apps to connect, and can turn it off anytime.
Additionally, the company says that your data never leaves the Google ecosystem. That may or may not reassure you, but obviously the Apple version will only access data sitting within your apps and inside your iCloud account.
Video run-through
Our sister site 9to5Google provides an 8-minute video which runs through all of the new features.
For our purposes, assume everything run by Google is instead run by Apple, as that will be the reality for the Gemini-powered Siri: it will access your Apple apps, use your iCloud data, and run either on-device or on Apple’s Private Cloud Compute servers. What are your thoughts?
Image: Google