Ever since generative AI chatbots arrived on the scene, it has been obvious that the virtual assistants living on our phones would never be the same again. Google kicked off the trend with Gemini, and AI is now finally giving Siri a much-needed boost as well.
At WWDC 2024, Apple introduced a new avatar of Siri. Borrowing from the skill sets of its rivals, the iPhone’s native AI assistant is finally getting more conversational, alongside superpowers such as summarization and deeper, more direct integration with apps.
For example, Siri will now understand your body gestures, such as head movements. You can nod your head yes to accept an incoming voice call, respond to messages, or interact with notifications. Apple says these capabilities are powered by machine learning running on the H2 chip, the silicon inside the latest AirPods Pro.
Apple’s take on AI is called Apple Intelligence, which the company says puts the focus on power, intuitiveness, deep product integration, contextual awareness, and privacy. The motto? “Intelligence that understands you.”
Apple is touting natural language understanding for Siri, among other improvements. The onboard AI can rewrite and summarize text across first-party apps such as Mail. Apple is also focusing on image generation, letting users create original media in sketch, illustration, and animation styles across apps like Messages and Freeform, among others.
And since the focus is on privacy, a healthy share of these AI chores is handled on-device, meaning your data never has to leave your phone for cloud processing. For more demanding workloads, Apple is touting Private Cloud Compute, a cloud infrastructure built atop Apple silicon that the company says offers the same level of data privacy as the Apple device in your hand.