When Apple announced Apple Intelligence at WWDC, the demos looked great, but the actual rollout was rough. Features were missing at launch, the smarter Siri got delayed twice, and many of the headline tools needed an iPhone 15 Pro or newer just to run.
Eighteen months later the picture looks different. Apple shipped most of what it promised, smoothed out the rough edges, and added a few things that were not in the original announcement. People who upgraded to an iPhone 16 are using these features daily without thinking about it. People still on older phones are mostly locked out.
What works well now
The writing tools are the most obvious win. Mail, Notes, Messages, and any text field across the system can now rewrite, proofread, summarize, or change the tone of what you typed. The output quality is not at GPT-4 level for long content, but for the short stuff most people write on a phone, it is genuinely useful.
A common pattern that emerged is using the Make Friendly tone option to soften work emails sent from a tired thumb on the subway. Another is tapping Summarize on long group chat threads, where Apple Intelligence gives you a one-line version of what you missed.
Notification summaries also got better. Early versions famously botched news headlines and Apple paused the feature for a while. The version that came back is more conservative, less likely to hallucinate, and works well for personal messages and group chats. For news, Apple just removed the feature instead of trying to fix it. That was the right call.
Image cleanup, which lets you tap a person or object to remove from a photo, is now reliable enough to actually use. Earlier versions left visible artifacts. The current version handles most everyday cleanups, like a stranger photobombing the background, in a way that holds up at full screen.
The new Siri
The big delayed feature, the rebuilt Siri with App Intents, is now live and it changes the calculus.
You can ask Siri to do things across apps in a single sentence. "Send the photo I took yesterday at the beach to my brother on WhatsApp." "Find the email from Sarah about the rental car and add the pickup time to my calendar." "Pull up the photo I took of that wine label and search for it on Vivino."
This worked in demos but failed in real use for the first six months after launch. Apple shipped a major update that retrained the model on a wider corpus of app actions, and the success rate is now genuinely high. Maybe one in ten requests still misfires. That is good enough that people are using it instead of opening the app.
The catch with Siri is the same as with most of Apple Intelligence: it needs the latest chip. The iPhone 14 and earlier cannot run it, and neither can the base iPhone 15. You need a 15 Pro, a 16, or a 16 Pro to get the full feature set.
What is still rough
Image generation is the weakest part. The Genmoji feature for making custom emojis is fun. Image Playground, for generating actual images, is mostly bad. The output looks like 2023-era diffusion models, with the same uncanny smoothness and weirdly proportioned faces. Apple seems to know it is behind here. They have been quietly testing a major update that should ship in the next few months.
ChatGPT integration, where Siri hands off complex questions to OpenAI, works but feels stitched on. The handoff is visible. Siri tells you it is going to ask ChatGPT, you confirm, and you get the answer. It is fine, but anyone who already has the ChatGPT app installed is mostly just using that directly.
Visual Intelligence, the Google Lens style feature where you point your camera at something and ask about it, is improving but not at parity with Google. For text translation it is great. For identifying random objects, plants, or restaurant menu items, Google still wins.
The hardware lock in problem
Here is the part that frustrates a lot of users. Apple is gating its best AI features to recent hardware in a way that feels stricter than competitors.
A Pixel 7, which came out in 2022, runs a meaningful chunk of Google Gemini Nano features. A Galaxy S22 from 2022 runs Samsung Galaxy AI features. An iPhone from 2022, the iPhone 14, runs almost none of Apple Intelligence.
Apple's argument is that the on device models need a Neural Engine that older chips do not have. That is technically true. But it also means that anyone holding onto an iPhone 13 or 14, which is most iPhone users globally, has to upgrade to even sample the new features.
For Apple this is not a problem. It is the opposite of a problem. The features are creating real upgrade pressure. Carriers are bundling Apple Intelligence into their iPhone 16 promos. Trade in values for older iPhones got a slight bump because of demand from people upgrading.
For users, especially in markets where new iPhones are very expensive, the gap between what your phone can do and what the new ones can do is now noticeable in daily use.
Should you upgrade
If you are on an iPhone 15 Pro, you already have most of this. No reason to upgrade for AI alone.
If you are on iPhone 14 or older and you use AI tools daily for work, the iPhone 16 is the first iPhone in years where the AI is a legitimate reason to consider an upgrade. Especially if you are coming from an iPhone 13 or older, the Camera Control button, the better display, and the AI features compound into a real difference.
If you mostly use your phone for messaging, calls, and social media, none of this matters yet. The features that benefit casual users, like the writing tools and notification summaries, are nice but not essential. Stick with what you have until your phone breaks.
Where this is going
The interesting question is what happens when Apple opens up Apple Intelligence to third party developers in a deeper way. Right now developers can plug into App Intents to make their apps work with Siri. The next layer, expected in the next major iOS, will let developers train smaller specialized models that run alongside Apple Intelligence on the device.
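For a sense of what "plugging into App Intents" looks like on the developer side, here is a minimal sketch of an intent a third-party app might expose to Siri. The type name, parameter, and dialog are hypothetical, invented for illustration; the `AppIntent` protocol, `@Parameter` wrapper, and `perform()` method are the actual shape of Apple's AppIntents framework.

```swift
import AppIntents

// Hypothetical example: an email app exposing a "find message" action to Siri.
// Siri can match a request like "find the email from Sarah about the rental car"
// to this intent and fill in the parameters from the sentence.
struct FindEmailIntent: AppIntent {
    static var title: LocalizedStringResource = "Find Email"

    @Parameter(title: "Sender")
    var sender: String

    @Parameter(title: "Topic")
    var topic: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app, this would query the app's own message store.
        // Here we just return a confirmation dialog as a placeholder.
        return .result(dialog: "Found the email from \(sender) about \(topic).")
    }
}
```

The intent only declares what the app can do; Siri handles parsing the spoken sentence and mapping it onto the parameters, which is why the retrained model mattered so much for the success rate.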
If that ships and works, your apps will start to feel a lot smarter without sending data to the cloud. That is the version of mobile AI that has been promised for years and never quite arrived.
For now, Apple Intelligence is finally what it was supposed to be at launch. Useful, fast, and quietly integrated into the things you already do. The only thing missing is the iPhone you would need to actually use it.