Sheejith's Personal Site

Apple’s new Siri will quietly use Google Gemini models behind the scenes

Apple has landed on its strategy for the new Siri update coming as soon as iOS 26.4 in the spring of next year. Behind the scenes, much of the new Siri experience will use Google Gemini models.

The custom Gemini model will run on Apple’s Private Cloud Compute servers to help fulfill user requests. Apple has promised that the new Siri will be able to answer personal questions like ‘find the book recommendation from Mom’ by hunting through data on your device and generating the appropriate response on the fly.

As previously reported by Bloomberg, the new Siri architecture will have three distinct components: a query planner, a knowledge search system, and a summarizer. Google Gemini models will run on Apple’s servers and provide the planner and summarizer capabilities.
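The three-part split reported by Bloomberg can be sketched as a simple pipeline. Everything below is illustrative: the function names, interfaces, routing logic, and sample data are assumptions for the sake of the sketch, not Apple’s actual implementation.

```python
# Hypothetical sketch of the reported three-component Siri architecture:
# a query planner, a knowledge search system, and a summarizer.
# All names, interfaces, and data here are illustrative assumptions.

def query_planner(request: str) -> dict:
    """Decide which data sources a request needs (reported to run server-side)."""
    plan = {"query": request, "sources": []}
    # Toy routing rule: personal-sounding requests go to on-device data.
    if any(word in request.lower() for word in ("mom", "message", "email")):
        plan["sources"].append("on_device_personal_data")
    else:
        plan["sources"].append("knowledge_search")
    return plan

def knowledge_search(plan: dict) -> list[str]:
    """Fetch candidate snippets for each planned source."""
    # Stand-in for on-device lookup and world-knowledge retrieval.
    fake_index = {
        "on_device_personal_data": ["Mom recommended 'The Overstory'."],
        "knowledge_search": ["A general world-knowledge snippet."],
    }
    return [hit for src in plan["sources"] for hit in fake_index[src]]

def summarizer(request: str, snippets: list[str]) -> str:
    """Condense retrieved snippets into a single spoken-style answer."""
    return f"Answer to '{request}': " + " ".join(snippets)

def handle_request(request: str) -> str:
    plan = query_planner(request)
    snippets = knowledge_search(plan)
    return summarizer(request, snippets)

print(handle_request("find the book recommendation from Mom"))
```

The value of the split is that each stage can run in a different trust domain: the planner and summarizer on server-hosted models, while the retrieval over personal data stays on the device.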

User privacy will be preserved by running the Google models on Apple’s server infrastructure without any external data sharing, while on-device personal data will likely be processed using Apple’s own Foundation Models.

The new knowledge search component may also be powered by Gemini models. It will give Siri an ‘understanding’ of world topics and trivia, allowing Apple’s voice assistant to answer more general knowledge questions without resorting to third-party integrations like the mediocre ChatGPT integration, or falling back to basic ‘I found this on the web’ search results.

Posted on: 11/3/2025 12:08:31 PM

