Here’s how iOS 18 could change the way you use your iPhone

The lock screen on the Apple iPhone 15 Plus. Andy Boxall / Digital Trends

It seems the long-overdue Siri overhaul will finally arrive at WWDC, just over a week from now, and the digital assistant will embrace AI trickery in all its forms. According to Bloomberg, Apple’s planned upgrades will integrate Siri deeply with on-device functions at the OS level and with installed apps, too.

“The new system will allow Siri to take command of all the features within apps for the first time,” the report says. The most notable change is that Siri will need nothing more than voice prompts to interact with apps, thanks to a major overhaul of the AI architecture powering it that puts large language models in command, much the way Gemini and ChatGPT draw their capabilities from such models.

Letting Siri control iPhone and iPad apps is reportedly Apple’s core focus, and it’s an area where Siri has lagged far behind Google Assistant on Android phones. The next-gen Siri will let users open documents, move notes between folders, email web links, or pull up a specific outlet in the Apple News app.

Can Siri finally catch up with AI powering it? Nadeem Sarwar / Digital Trends

Siri will start by handling a single natural language command at a time, but down the road, Apple will let it handle a chain of multistep voice prompts as well. This is one of the biggest planned upgrades and would make Siri interactions far more rewarding. Google Assistant already supports chained voice queries, such as “Hey Google, dim the lights and play music.”

In Siri’s case, that would mean accomplishing tasks like editing a picture and sending it to a contact, or summarizing a note and emailing it, all with a single voice command. Some of the planned Siri enhancements will roll out in phases rather than arriving all at once.
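Apple hasn’t said how these multistep tasks will be modeled under the hood, but the existing App Intents framework gives a sense of what a single voice-triggered action looks like to a developer today. The sketch below is purely illustrative, assuming a hypothetical SummarizeAndEmailNoteIntent with placeholder logic; it is not a confirmed Apple design.

```swift
import AppIntents

// Hypothetical sketch: a multistep task ("summarize this note and email it")
// modeled as a single App Intent. The type name, parameters, and placeholder
// logic are illustrative, not an announced Apple API.
struct SummarizeAndEmailNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Note and Email It"

    @Parameter(title: "Note Text")
    var noteText: String

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Stand-in for on-device summarization; a real app would call its own
        // summarizer here and hand the result to its mail or share flow.
        let summary = noteText.count > 200 ? String(noteText.prefix(200)) + "..." : noteText
        print("Would email summary to \(recipient): \(summary)")
        return .result(dialog: "Summarized the note and drafted an email to \(recipient).")
    }
}
```

The reported change is that an LLM-driven Siri could string together several such actions from one spoken request, instead of the user invoking each one separately.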

Siri will run the show, one voice cue at a time

Siri will soon talk and work with native apps on your iPhone. Nadeem Sarwar / Digital Trends

Taking a cue from what Microsoft’s Copilot and Gemini are promising, Siri will also summarize articles for users with a voice prompt. “The new system will go further, using AI to analyze what people are doing on their devices and automatically enable Siri-controlled features,” adds the Bloomberg report.

In the early days, Siri’s integrations will be limited to Apple’s in-house apps, but the company already has the infrastructure in place for developers to integrate Siri smarts into their own apps. We’ll learn more about exactly how the AI upgrades fit into the picture at WWDC 2024 in the coming days.
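Bloomberg doesn’t name the mechanism, but the developer-facing plumbing Apple already ships is the App Intents framework and its App Shortcuts layer, which lets an app register natural-language phrases that Siri routes to in-app actions. Here’s a minimal, hypothetical sketch of how a third-party app exposes an “open document” action today; the intent name, phrases, and placeholder logic are assumptions for illustration.

```swift
import AppIntents

// Hypothetical example of today's developer-facing Siri plumbing: an app
// exposing an "open document" action through App Intents and App Shortcuts.
struct OpenDocumentIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Document"

    @Parameter(title: "Document Name")
    var documentName: String

    func perform() async throws -> some IntentResult {
        // A real app would look up and open the named document here.
        print("Opening document named \(documentName)")
        return .result()
    }
}

// Registers spoken phrases so Siri can route a matching voice prompt
// straight to the intent above, with no screen taps required.
struct ExampleAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenDocumentIntent(),
            phrases: ["Open a document in \(.applicationName)"],
            shortTitle: "Open Document",
            systemImageName: "doc"
        )
    }
}
```

The behavior Bloomberg describes would go further, with Siri’s language models deciding when to trigger actions like these based on what’s on screen, rather than relying only on phrases developers pre-script.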

Apple has more system-level AI tweaks planned for iOS 18 than just reimagining Siri. Notably, the company will handle the bulk of AI processing on the iPhone’s silicon, offloading only a small portion of it to the cloud.

Google has been marketing on-device processing heavily, with the Gemini Nano model running natively on the Tensor silicon inside the Google Pixel 8 series phones for tasks like summarization, transcription, and smart replies. It appears that Apple’s new AI tricks won’t stray too far from what Google has showcased for Android phones.

Another notable upgrade Apple has developed relates to notifications, something we haven’t seen from the company so far. Using AI, it aims to offer summarized recaps of notifications. This doesn’t just cover app notifications, call alerts, and messages; it will also include support for documents, news alerts, news articles, and webpages in different media formats.