To relaunch the Siri machine, the Cupertino firm announced in June 2024, at WWDC, a veritable brain transplant: the assistant was to integrate new, more capable AI models developed in-house … which clearly turned out not to be so capable! The company has indeed pushed this new Siri back to next year, when it had been more or less understood that it would arrive in spring 2025.
Siri's intelligence outsourced?
Since then, it has been all hands on deck internally. Siri's development is in new hands, namely those of Craig Federighi, the senior vice president of software engineering, and Mike Rockwell, who became the assistant's "chief" after overseeing the launch of the Vision Pro. With the big maneuvers now over, the two executives have embarked on a rescue operation and are looking for solutions … which could involve external AI models.
Apple develops two types of AI models: those that run locally on devices and handle simple functions like Genmoji (emoji combinations) and email summaries, and those that run on Private Cloud Compute, servers built by the manufacturer to protect data confidentiality. The new Siri and its personalized functions must rely on these online models, which still do not seem to measure up to the competition.
This is why Federighi and Rockwell reportedly went knocking on the doors of OpenAI and Anthropic to have their GPT and Claude models tested on Apple's servers, according to Bloomberg. And it is the latter that reportedly gives the best results. In other words, the brain of the future Siri could be the same as that of the Claude bot! But Anthropic, sensing a good deal, is said to be asking Apple for billions of dollars a year, and Apple, as everyone knows, is famously tight with its money.
However, this would not be the first time the manufacturer has relied on an external AI model for Apple Intelligence functions: ChatGPT is already used in iOS 18 to answer the questions Siri cannot handle. But here, we are talking about particularly important functions. Siri 2.0 must be able to "read" the iPhone screen, pull data from apps, and "understand" the context in which a user asks a question.
These are functions that a model like Anthropic's or OpenAI's can perform, and Apple may have no choice but to entrust them to one of them to get Siri back on course. This integration could nevertheless cause quite a mess internally: Bloomberg has heard talk of a potential mutiny among the teams responsible for AI development, who do not want to be blamed for the assistant's problems.