Apple’s AI strategy might be taking a familiar turn: the same platform play that made the iPhone what it is today. According to a recent Bloomberg report, Apple is working on a new “Extensions” system in iOS 27 that would let third-party AI assistants plug directly into Siri, including services like Google Gemini and Anthropic’s Claude.
More importantly, this won’t just be a hidden setting. Instead, Apple is reportedly planning a dedicated section inside the App Store for these AI integrations, effectively creating a marketplace for AI tools, very similar to how apps are distributed today.
What does this actually mean for Siri?
It’s a massive shift. Instead of trying to build one perfect AI, Apple seems to be turning Siri into a hub, or “router,” for multiple AI models, letting users choose which assistant handles their queries. That means Siri could act as the front end while different AIs handle different tasks: one for writing, another for coding, another for research. It’s less “Siri vs. ChatGPT” and more “Siri + everything.”

As things stand, Apple is reportedly pursuing a two-pronged strategy: building its own in-house AI (Apple Intelligence) while also opening the door to third-party services. This lets Apple stay competitive without betting everything on a single model, and it could help keep users from jumping ship to Android.

There’s also a business angle here. By turning AI tools into something users can install via the App Store, Apple could take a cut of subscriptions, just like it does with apps today.
So… is Siri becoming the new App Store?
This could completely change how AI works on phones. Instead of relying on one assistant to do everything, Apple seems to be moving toward a modular setup where users mix and match AI tools based on what they need. If that vision plays out, Siri won’t just be an assistant anymore; it will be a platform.