Apple Intelligence acts as a personal AI agent across all your apps

Craig Federighi in front of a screen reading "Apple Intelligence." Credit: Apple

During last year’s Worldwide Developers Conference (WWDC) keynote address, Apple executives mentioned the phrase “AI” exactly zero times. Oh, what a difference a year makes. At WWDC 2024 on Monday, Senior Vice President of Software Engineering Craig Federighi revealed Apple Intelligence, a new AI system “comprised of highly capable large language and diffusion models specialized for your everyday tasks” that will impact and empower apps across the company’s lineup of devices.

“This is a moment we’ve been working towards for a long time,” Federighi said. “Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac. It draws on your personal context to give you intelligence.” The machine learning system promises to enable your mobile and laptop devices to “understand and create language, as well as images, and take action for you to simplify interactions across your apps.”

For example, Apple Intelligence will allow your iPhone to prioritize specific system notifications to minimize distractions while you focus on a task. New AI writing aides can proofread your work, rewrite it on command, and summarize text for you. Those will be available across a variety of system apps including Mail, Notes, Safari, and Pages — and third-party apps as well — Federighi explained.

What’s more, Apple Intelligence can leverage its computer vision capabilities to create entirely new images from photos already on the camera roll. For example, “when you wish a friend a happy birthday, you can create an image of them surrounded by cake, balloons, and flowers to make it extra festive,” Federighi said. “And the next time you tell Mom that she’s your hero, you can send an image of her in a superhero cape to really land your point.” The user can even choose between a trio of artistic genres in which to display their generated works: sketch, illustration, and animation. 

Apple Intelligence’s biggest feature, however, will be its ability to interact with the various apps across a device on the user’s behalf, leveraging the user’s personal data to streamline everyday actions. For example, users will be able to find photos of specific groups and individuals within their camera roll simply by describing the shot and the people in it. Or, rather than digging through their email or messages to find a file a co-worker previously shared, users can simply say, “pull up the files that Joz shared with me last week.”

The system is “grounded in your personal information and context, with the ability to retrieve and analyze the most relevant data from across your apps, as well as to reference the content on your screen,” Federighi said. It’s what allows the system to predict, for example, whether a rescheduled business meeting would make the user late for their child’s dance recital. As Federighi illustrated, Apple Intelligence will “understand who my daughter is, the play details she sent several days ago, the time and location for my meeting, and predicted traffic between my office and the theater.”

While some users might blanch at the prospect of providing Apple Intelligence with that degree of access to their (and their children’s) personal data, Apple says it has taken extraordinary steps to ensure that information stays private. Most of Apple Intelligence’s operations happen on-device, powered by the company’s latest generations of A17 and M-family processors, Federighi said. “It’s aware of your personal data, without collecting your personal data,” he added.

Any operations that do need to be performed in the cloud will run in Apple’s own data centers on Apple silicon. Rather than using the public clouds of hyperscalers like Google Cloud, Microsoft Azure, or Amazon’s AWS, Apple built its own private infrastructure to handle just these machine learning compute requests.

“When you make a request, Apple Intelligence analyzes whether it can be processed on-device,” Federighi explained. “If it needs greater computational capacity, it can draw on Private Cloud Compute and send only the data that’s relevant to your task to be processed on Apple silicon servers.”
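The routing Federighi describes — run locally when possible, and otherwise send only the task-relevant slice of personal context to Private Cloud Compute — can be sketched roughly as follows. This is an illustrative sketch only: the function names, the complexity threshold, and the request structure are all assumptions for the example, not Apple's actual implementation.

```python
# Illustrative sketch of the request-routing decision described in the
# keynote. All names and thresholds here are hypothetical.

ON_DEVICE_CAPACITY = 8_000  # assumed limit on request complexity


def route_request(request):
    """Decide where a request runs and what data leaves the device.

    `request` is a dict with an estimated compute 'cost', the full
    'context' gathered on-device, and the keys of the context that are
    'relevant' to this task. Only that relevant subset would ever be
    sent off-device.
    """
    if request["cost"] <= ON_DEVICE_CAPACITY:
        # Small enough to process locally; nothing leaves the device.
        return {"target": "on-device", "payload": None}
    # Larger requests go to Apple silicon servers, carrying only the
    # task-relevant slice of personal context.
    payload = {
        key: value
        for key, value in request["context"].items()
        if key in request["relevant"]
    }
    return {"target": "private-cloud-compute", "payload": payload}


small = {"cost": 1_200, "context": {"calendar": "...", "photos": "..."},
         "relevant": set()}
large = {"cost": 50_000, "context": {"calendar": "...", "photos": "..."},
         "relevant": {"calendar"}}

print(route_request(small)["target"])   # on-device
print(route_request(large)["payload"])  # only the calendar data is sent
```

Note how the large request's payload omits the photo library entirely — the point of the design, as presented, is that data irrelevant to the task never reaches the server.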

“Your data is never stored or made accessible to Apple,” he continued. “It’s used exclusively to fulfill your request and, just like your iPhone, independent experts can inspect the code that runs on these servers to verify this privacy promise.”  In fact, he explained, a user’s mobile or laptop device won’t even connect to a server unless its software has been publicly logged for expert inspection.

While other companies in the burgeoning AI space have been scrambling to incorporate machine learning operations into their existing products and release them to the public as quickly as possible (with occasionally disastrous results), Apple has taken a far more measured approach toward developing and distributing its own AI capabilities.

“We continue to feel very bullish about our opportunity in generative AI and we’re making significant investments,” Apple CEO Tim Cook told Reuters in a May interview. He was also quick to point out that Apple has spent $100 billion on research and development over the past five years.

Although AI wasn’t mentioned directly in last year’s keynote, the company did roll out a number of machine learning-enhanced features during WWDC 2023. Those include the Lock Screen’s live video, “ducking autocorrect,” the Journal app’s personalized writing prompts, the Health app’s myopia test, and the AirPods’ ability to tune playback settings based on prevailing environmental conditions, among others. Though it’s gone by a different name in the past, this is clearly not Apple’s first rodeo.

Apple Intelligence will be available to try later this summer.