Apple didn’t have a lot to say about Apple Intelligence at last week’s Worldwide Developers Conference, focusing instead on iOS 26 and the new Liquid Glass interface that will extend to the iPhone and all of its devices. But even if it had announced new AI features, we’d still be waiting for the new operating systems to arrive in the fall to take advantage of them (unless you want to live on the edge and install the first developer betas now).
I sat down to figure out just which of the current Apple Intelligence features I actually use. They aren’t necessarily the showy ones, like Image Playground, but ones that help in small, significant ways. Admittedly, Apple Intelligence has gotten off to a rocky start, from misleading message summaries to delayed Siri improvements, but the AI tech is far from being a bust.
If you have a compatible iPhone (an iPhone 15 Pro, iPhone 16e, iPhone 16 or iPhone 16 Pro, or their Plus and Max variants), I want to share six features that I’m turning to nearly every day.
More features will be added as time goes on — and keep in mind that Apple Intelligence is still officially beta software — but this is where Apple is starting its AI age.
On the other hand, maybe you’re not impressed with Apple Intelligence, or you’d rather wait until the tools evolve before relying on them. You can easily turn off Apple Intelligence entirely or use just a smaller subset of features.
Get alerted to priority notifications
This feature arrived only recently, but it’s become one of my favorites. When a notification arrives that seems like it could be more important than others, Prioritize Notifications pops it to the top of the notification list on the lock screen (with a colorful Apple Intelligence shimmer, of course). In my experience so far, those include weather alerts, texts from people I regularly communicate with and email messages that contain calls to action or impending deadlines.
To enable it, go to Settings > Notifications > Prioritize Notifications and then turn the option on. You can also enable or disable priority alerts from individual apps from the same screen. You’re relying on the AI algorithms to decide what gets elevated to a priority — but it seems to be off to a good start.
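Apple doesn’t document how its model decides what counts as a priority, but apps already hand the system explicit hints it can draw on. For the curious, here’s a minimal Swift sketch of those developer-side signals, using the UserNotifications framework’s interruptionLevel and relevanceScore; the reminder itself is hypothetical, and whether Apple Intelligence weighs these values is an assumption on my part, not anything Apple has confirmed.

```swift
import UserNotifications

// A minimal sketch of the hints an app can attach to a local notification.
// interruptionLevel and relevanceScore are real UserNotifications APIs;
// whether and how Apple Intelligence factors them in is undocumented.
func scheduleDeadlineReminder() {
    let content = UNMutableNotificationContent()
    content.title = "Invoice due"
    content.body = "The Q3 invoice must be submitted by 5 p.m. today."
    // Time-sensitive alerts can break through Focus modes (this requires
    // the Time Sensitive Notifications capability in the app's entitlements).
    content.interruptionLevel = .timeSensitive
    // A 0.0 to 1.0 score the system can use when sorting notifications.
    content.relevanceScore = 0.9

    // Fire in an hour.
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 3600, repeats: false)
    let request = UNNotificationRequest(identifier: "invoice-deadline",
                                        content: content,
                                        trigger: trigger)
    UNUserNotificationCenter.current().add(request)
}
```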
Summarize notifications
[Image: Apple Intelligence summarized two text messages.]
Sometimes summaries are vague, and sometimes they’re unintentionally funny, but so far I’ve found them to be more helpful than not. Summaries can also be generated from alerts by third-party apps, such as news or social media apps, although I suspect that my outdoor security camera is picking up multiple passersby over time and not actually telling me that 10 people are stacked up by the door.
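The summaries come from an on-device language model with no public API, so there’s nothing for the rest of us to call directly. Purely as an illustration of the general idea, here’s a toy extractive summarizer in Swift built on the NaturalLanguage framework’s tokenizer. It returns the single sentence whose words are most frequent across the text, which is a far cry from the abstractive summaries Apple generates, but it shows the shape of the problem.

```swift
import Foundation
import NaturalLanguage

// Toy extractive "summary": pick the sentence whose words appear most often
// across the whole text. This is not Apple's approach (its model has no
// public API); it's just the simplest possible illustration of the idea.
func naiveSummary(of text: String) -> String? {
    // Count how often each word appears.
    var frequency: [String: Int] = [:]
    let wordTokenizer = NLTokenizer(unit: .word)
    wordTokenizer.string = text
    wordTokenizer.enumerateTokens(in: text.startIndex..<text.endIndex) { range, _ in
        frequency[text[range].lowercased(), default: 0] += 1
        return true
    }

    // Score each sentence by the total frequency of its words.
    var best: (sentence: String, score: Int)?
    let sentenceTokenizer = NLTokenizer(unit: .sentence)
    sentenceTokenizer.string = text
    sentenceTokenizer.enumerateTokens(in: text.startIndex..<text.endIndex) { range, _ in
        let sentence = String(text[range])
        var score = 0
        wordTokenizer.string = sentence
        wordTokenizer.enumerateTokens(in: sentence.startIndex..<sentence.endIndex) { wordRange, _ in
            score += frequency[sentence[wordRange].lowercased(), default: 0]
            return true
        }
        if score > (best?.score ?? -1) {
            best = (sentence.trimmingCharacters(in: .whitespacesAndNewlines), score)
        }
        return true
    }
    return best?.sentence
}
```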
[Image: Summarize long articles in Safari in the Reader interface.]
Siri gets a glow-up and better interaction
I was amused that during the iOS 18 and iPhone 16 launches, the main visual indicator of Apple Intelligence, the full-screen, color-at-the-edges Siri animation, was noticeably missing. Apple even lit up the edges of the massive glass cube of its Apple Fifth Avenue Store in New York City like a Siri search.
Instead, iOS 18 used the same old Siri sphere. The modern Siri look arrived as of iOS 18.1, but only on devices that support Apple Intelligence. If you’re wondering why you’re still seeing the old interface, make sure Apple Intelligence is turned on in Settings > Apple Intelligence & Siri.
You can also interact with Siri by typing. On the iPhone or iPad, double-tap the bar at the bottom of the screen to bring up Type to Siri, a voice-free way to enter queries.
On a Mac, go to System Settings > Apple Intelligence & Siri and choose a key combination under Keyboard shortcut, such as Press Either Command Key Twice.
Yes, this involves more typing than just speaking conversationally, but I can enter more specific queries and not wonder whether my robot friend understands what I’m saying.
Remove distractions from your pictures using Clean Up in the Photos app
Until iOS 18.1, the Photos app on the iPhone and iPad lacked a simple retouch feature. Dust on the camera lens? Litter on the ground? Sorry, you needed to deal with those and other distractions in the Photos app on macOS or in a third-party app.
Now Apple Intelligence includes Clean Up, an AI-enhanced removal tool, in the Photos app. When you edit an image and tap the Clean Up button, the iPhone analyzes the photo and suggests potential items to remove by highlighting them. Tap a highlighted item, or draw a circle around an area, and the app erases it, using generative AI to fill in plausible pixels.
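Clean Up’s removal-and-fill pipeline is private, but the analysis step that highlights likely subjects resembles the segmentation developers can reach through the Vision framework. Here’s a rough sketch under that assumption; it requires iOS 17 or macOS 14, and it produces only a subject mask, with none of the generative fill.

```swift
import Vision
import CoreVideo

// A rough sketch of the analysis step only: ask Vision for the foreground
// subjects in an image. Clean Up's suggestion and fill logic is private;
// this just returns a mask an editor could feather, invert or fill behind.
func foregroundMask(for image: CGImage) throws -> CVPixelBuffer? {
    let request = VNGenerateForegroundInstanceMaskRequest()
    let handler = VNImageRequestHandler(cgImage: image)
    try handler.perform([request])

    guard let observation = request.results?.first else { return nil }
    // Scale the soft mask to the original image's dimensions.
    return try observation.generateScaledMaskForImage(
        forInstances: observation.allInstances,
        from: handler)
}
```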
Let the Reduce Interruptions Focus filter distractions for you
The Reduce Interruptions Focus mode intelligently filters possible distractions. Turn it on from Control Center, and when something comes in that might need your attention, it shows up as a notification marked Maybe Important.
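Reduce Interruptions itself is a system feature, but third-party apps can react when any Focus turns on by exposing a Focus filter through the App Intents framework. Here’s a minimal sketch for a hypothetical chat app; the ChatFocusFilter type and its mute setting are invented for illustration.

```swift
import AppIntents

// A hypothetical Focus filter. Once the user attaches it to a Focus such
// as Reduce Interruptions, the system calls perform() when that Focus
// turns on or off, letting the app adjust its own behavior to match.
struct ChatFocusFilter: SetFocusFilterIntent {
    static var title: LocalizedStringResource = "Set chat behavior"

    @Parameter(title: "Mute group chats", default: true)
    var muteGroupChats: Bool

    // What the Focus settings screen shows for this configuration.
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "Chat behavior",
                              subtitle: muteGroupChats ? "Group chats muted" : "All chats active")
    }

    func perform() async throws -> some IntentResult {
        // Apply in-app state here, e.g. via a (hypothetical) settings store:
        // ChatSettings.shared.muteGroupChats = muteGroupChats
        return .result()
    }
}
```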
For more on Apple Intelligence features, check out how to create Genmoji, how to use Image Wand and, if you want to scale things back, how to disable select Apple Intelligence features.