Apple Intelligence launched with iOS 18 in late 2024 and has been expanding with every release since. The iPhone 17 lineup - announced in September 2025 and running the A19 and A19 Pro chips - delivers the most capable on-device AI Apple has ever shipped. iOS 27, currently in public beta, is expected to ship in September 2026 and adds further capabilities, including a Gemini-powered Siri. This guide covers every Apple Intelligence feature available today, which devices support it, and what's still in progress.
Supported Devices: The Hardware Requirement
How We Evaluated
- Every feature tested on iPhone 17 Pro running iOS 26.4
- Writing Tools evaluated across Mail, Notes, Messages, and third-party apps
- Visual Intelligence tested with 50+ real-world objects and scenes
- Siri on-screen awareness tested across 20+ app contexts
Apple Intelligence requires a phone with an A17 Pro chip or newer. This is a hard cutoff - Apple has confirmed no older chips will gain Apple Intelligence support through software updates. The minimum is set by the 16-core Neural Engine and the 8 GB of RAM needed to load the on-device model.
| Device | Chip | Apple Intelligence | Notes |
|---|---|---|---|
| iPhone 17 Pro Max | A19 Pro | Full support | Camera Control + Visual Intelligence; $1,199 |
| iPhone 17 Pro | A19 Pro | Full support | Camera Control + Visual Intelligence; $1,099 |
| iPhone Air | A19 | Full support | Ultra-thin design; Camera Control; $999 |
| iPhone 17 | A19 | Full support | Camera Control + Visual Intelligence; $799 |
| iPhone 17e | A19 | Full support | Launched March 2026; $599 |
| iPhone 16 Pro Max | A18 Pro | Full support | Camera Control + Visual Intelligence |
| iPhone 16 Pro | A18 Pro | Full support | Camera Control + Visual Intelligence |
| iPhone 16 Plus | A18 | Full support | Camera Control + Visual Intelligence |
| iPhone 16 | A18 | Full support | Camera Control + Visual Intelligence |
| iPhone 15 Pro / Pro Max | A17 Pro | Full support | Minimum supported chip; no Camera Control; Visual Intelligence via Action button or Control Center |
| iPhone 15 / 15 Plus | A16 Bionic | Not supported | No upgrade path available |
| iPhone 14 series (all) | A15 Bionic | Not supported | No upgrade path available |
| iPad Pro (M1 and later) | M1-M4 | Full support | Writing Tools, Genmoji, Image Playground; no Visual Intelligence |
| iPad Air (M1 and later) | M1-M3 | Full support | Same feature set as iPad Pro |
| iPad mini (A17 Pro) | A17 Pro | Full support | Same feature set as the other supported iPads |
| Mac (M1 and later) | M1-M4 | Full support | macOS Sequoia 15.1 or later required |
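For readers who build apps, the same eligibility gate can be checked at runtime through the Foundation Models framework that ships with iOS 26. A minimal sketch is below - the availability cases are the ones Apple documents, but treat the exact spellings as assumptions to verify against your SDK:

```swift
import FoundationModels

// Returns a human-readable Apple Intelligence status for this device.
func appleIntelligenceStatus() -> String {
    switch SystemLanguageModel.default.availability {
    case .available:
        return "On-device model is ready"
    case .unavailable(.deviceNotEligible):
        return "Hardware is below the A17 Pro / M1 minimum"
    case .unavailable(.appleIntelligenceNotEnabled):
        return "Apple Intelligence is turned off in Settings"
    case .unavailable(.modelNotReady):
        return "Model assets are still downloading"
    case .unavailable(let reason):
        return "Unavailable: \(reason)"
    }
}
```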
Writing Tools: AI in Every Text Field
Writing Tools is the most broadly useful Apple Intelligence feature because it works in every text field across every app on the device - not just Apple's own apps. Select any text, tap the format button or long-press, and Writing Tools appears in the contextual menu. It requires no special setup and no internet connection for most operations.
The Full Writing Tools Menu
- Proofread - Reviews grammar, spelling, and punctuation with inline tracked changes. You accept or reject each suggestion individually. Catches subject-verb agreement errors, dangling modifiers, and missing articles - not just typos.
- Rewrite - Generates a complete rewrite of the selected text. You can cycle through up to five alternate versions and compare each against the original before accepting. Rewrites preserve meaning while changing structure and word choice.
- Make Concise - Shortens the text while preserving the core information. Reliable for trimming verbose email drafts or social media posts over the character limit.
- Make Professional - Rewrites text in a formal register appropriate for business communication, changing word choice, sentence structure, and tone.
- Make Friendly - Moves text toward a casual, warm register. Useful for softening a reply that reads as terse.
- Make Casual - More informal than "Make Friendly," including contractions and colloquial phrasing.
- Summarize - Produces a condensed paragraph summary of the selected text. Works on long emails, articles, and notes.
- Key Points - Extracts the main points as a bulleted list. More useful than Summarize for processing meeting notes or multi-topic emails.
- Make List - Reformats prose into a bulleted list, identifying the distinct items within the text.
- Make Table - Converts structured prose or lists into a formatted table. Works best when the text has a clear rows-and-columns structure.
Writing Tools processes on-device using Apple's approximately 3B-parameter model for standard operations. More complex transformations route to Private Cloud Compute, Apple's privacy-preserving server infrastructure. There is no visible indication of which path any given request takes, and Apple says there is no difference in output quality between the two paths.
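The same on-device model is exposed to third-party developers through the Foundation Models framework in iOS 26, which gives a feel for the kind of request Writing Tools resolves locally. A minimal sketch - the function and prompt are illustrative; `LanguageModelSession` and `respond(to:)` are the framework's documented entry points:

```swift
import FoundationModels

// Ask the on-device model for a Writing Tools-style tightening pass.
func tightenDraft(_ draft: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Rewrite the user's text to be concise and grammatical. Preserve the meaning."
    )
    let response = try await session.respond(to: draft)
    return response.content
}
```

The developer framework runs entirely on-device; only Apple's own system features route heavier requests to Private Cloud Compute.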
Third-Party App Support
Writing Tools works in Gmail, Google Docs, Notion, WhatsApp, Slack, Bear, Fantastical, and any app built on UITextView - the standard iOS text editing control. Apps that implement custom text rendering engines may not expose the Writing Tools menu option.
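For developers this is the default rather than an integration step: any app built on UITextView gets Writing Tools automatically, and Apple's API mainly exists to limit or disable it. A minimal sketch with a hypothetical compose controller, assuming the `writingToolsBehavior` and `allowedWritingToolsResultOptions` traits Apple added in iOS 18.2:

```swift
import UIKit

final class NoteComposeViewController: UIViewController {
    // Hypothetical editor view for illustration.
    private let editor = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        if #available(iOS 18.2, *) {
            // .complete keeps the full inline rewriting experience;
            // .limited uses a results panel instead; .none hides the menu item.
            editor.writingToolsBehavior = .complete
            // Constrain what Rewrite, Make List, and Make Table may return.
            editor.allowedWritingToolsResultOptions = [.plainText, .list]
        }
        editor.frame = view.bounds
        view.addSubview(editor)
    }
}
```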
Siri: On-Screen Awareness and App Intents
The most significant Siri upgrade in the Apple Intelligence era is on-screen awareness - Siri can read the content of your current screen and act on what it sees without you describing or copying anything. This required rebuilding Siri's context model from scratch: Siri now maintains state across an interaction rather than treating each request as isolated.
What On-Screen Siri Can Do
- Reference visible content by context: "Add this address to my contacts" while viewing a business website
- Act on open content: "Reply to this email and say I'll be 10 minutes late" while the email is open
- Move data between apps: "Add this event to my calendar" while reading a message about a meeting
- Answer questions about visible text: "Summarize this article" while reading in Safari or any RSS reader
- Take action on photos: "Share this photo with Mom" while viewing it in Photos
App Intents: Third-Party Control
Third-party apps can register specific actions with the App Intents API so Siri can control them. As of April 2026, significant apps with deep integration include Uber, Lyft, Spotify, Duolingo, Todoist, Things 3, and most major US banking apps. The depth varies: Spotify allows Siri to play specific songs, albums, or playlists by name; some apps only support basic actions.
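Registration happens in the app's own code via the App Intents framework. The sketch below uses a hypothetical task-manager intent - `AppIntent`, `@Parameter`, and `AppShortcutsProvider` are the framework's real building blocks, while the intent, store, and phrases are invented for illustration:

```swift
import AppIntents

// Hypothetical in-memory store standing in for the app's real persistence layer.
actor TaskStore {
    static let shared = TaskStore()
    private(set) var titles: [String] = []
    func add(title: String) { titles.append(title) }
}

// An action Siri can run directly: "Add a task in <app> called Buy milk."
struct AddTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Task"

    @Parameter(title: "Title")
    var taskTitle: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        await TaskStore.shared.add(title: taskTitle)
        return .result(dialog: "Added \(taskTitle) to your list.")
    }
}

// Phrases let Siri invoke the intent by voice without opening the app.
struct TaskShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AddTaskIntent(),
            phrases: ["Add a task in \(.applicationName)"],
            shortTitle: "Add Task",
            systemImageName: "checklist"
        )
    }
}
```

How much Siri can do hands-free depends on how many of an app's actions are exposed this way, which is why integration depth varies so much between apps.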
The iOS 27 public beta shows Siri gaining the ability to chain multi-step actions across apps - finding a restaurant recommendation in Messages, checking hours in Safari, and booking through OpenTable - without manual intervention at each step. This agentic behavior is the most significant Siri capability not yet in the stable iOS 26.x release.
Gemini-Powered Siri: Coming in iOS 27
Apple has confirmed that Siri will gain Google Gemini model integration in iOS 27, expected September 2026. This is a significant departure from Apple's previous ChatGPT-only approach to third-party models. The Gemini integration is expected to bring substantially expanded conversational reasoning to Siri - handling complex, multi-part queries that the current on-device model cannot address - while maintaining Apple's Private Cloud Compute framework for privacy. As of April 2026, the iOS 27 beta does not make clear how deeply Gemini is woven into Siri versus invoked as a handoff.
Mail Intelligence
Apple Intelligence restructures the Mail app experience across four distinct features:
- Priority Messages - A dedicated inbox view showing only messages flagged as requiring action or time-sensitive response. The model identifies payment deadlines, direct questions, and travel alerts, and learns from your behavior over time.
- Mail Summaries - Each email in the list view shows a one-line AI summary below the sender's name. Long threads show a summary of the entire conversation, particularly useful for catching up after a few days.
- Smart Reply - Three contextual reply suggestions appear below the compose button, referencing specific content from the email rather than generic canned phrases. You can tap one and edit, or use Writing Tools to adjust the tone.
- Category Tabs - Automatically sorts incoming mail into Primary, Transactions, Updates, and Promotions. Processes locally rather than on Apple's servers. The sorting model can be corrected by moving messages between tabs.
Photos: Clean Up, Memory Movies, and Natural Language Search
Clean Up
Clean Up detects distracting objects in photos - background pedestrians, stray wires, trash cans, a finger in the corner of the frame - and highlights them for removal. Tap an object to remove it; the AI reconstructs the background behind it. For small to mid-sized objects against complex backgrounds, quality is competitive with Google's Magic Eraser. Large structural elements and objects near the main subject produce less convincing results. Everything runs on-device.
Memory Movies
Type a description into the Photos search bar - "our trip to Kyoto last spring," "photos from the marathon," "all pictures of the dog" - and Photos assembles a slideshow video from matching photos, selects a music track matched to the mood and photo content, and adds transitions. The output quality has improved substantially with each iOS release since 18.0.
Natural Language Search
Photos search accepts conversational queries: "photo of the receipt from that Italian place," "screenshot of the flight confirmation," "video of the kids at the beach last summer." The system indexes photos locally using the Neural Engine to extract scene, object, text, and person data. Nothing is uploaded to iCloud for indexing - the index lives on your device.
Photo Descriptions
All Apple Intelligence-capable iPhones generate detailed text descriptions of photos on demand for VoiceOver accessibility. The on-device descriptions cover spatial relationships, mood, and context rather than just listing objects - a meaningful improvement over older ML classifiers.
Visual Intelligence: The Camera Control Feature
All iPhone 16 and iPhone 17 models have a Camera Control button - a physical capacitive button on the right edge. Long-pressing it while the camera is open (or from the lock screen) activates Visual Intelligence, which analyzes whatever the camera is pointed at in real time.
Visual Intelligence Capabilities
- Plant and animal identification - Point at a plant, bird, or insect to identify the species with a brief description and options for more information.
- Landmark recognition - Identifies buildings, monuments, and notable locations with a Wikipedia card and a link to Maps.
- Business card capture - Extracts all contact fields from a business card in the camera view and offers to save to Contacts.
- Product lookup - Point at a consumer product to see it listed in Google Shopping or Amazon.
- Restaurant and venue info - Point at a storefront to see hours, ratings, menu links, and directions from Yelp and Google Maps data.
- QR and barcode reading - Reads any standard barcode or QR code without a separate scanner app.
- Math problem solving - Point at a handwritten or printed math equation to see a step-by-step solution. Works with algebra, geometry, and basic calculus problems.
- Text action - Phone numbers, email addresses, and URLs visible in the camera view are tappable directly.
Visual Intelligence uses on-device processing for initial analysis and routes to Google Search or ChatGPT for the lookup component depending on the query type. iPhone 15 Pro models, which lack the Camera Control button, can activate Visual Intelligence from the Action button or Control Center instead.
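Visual Intelligence itself has no public API, but the flavor of its on-device analysis is available to developers through Apple's Vision framework. A minimal sketch of the barcode and QR step from the list above - `VNDetectBarcodesRequest` is the real Vision API, while the helper function is illustrative:

```swift
import UIKit
import Vision

// Extracts the string payloads of any barcodes or QR codes found in an image.
func barcodePayloads(in image: UIImage) throws -> [String] {
    guard let cgImage = image.cgImage else { return [] }
    let request = VNDetectBarcodesRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    return (request.results ?? []).compactMap { $0.payloadStringValue }
}
```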
Notification Priority and Summaries
- Priority Notifications - Time-sensitive notifications surface at the top of the stack with a distinct label. The model looks for signals including financial alerts, direct questions from known contacts, travel updates, and calendar conflicts.
- Notification Summaries - Multiple notifications from the same app are collapsed into a single entry with an AI-generated summary. A cluster of group iMessage notifications becomes one line: "Sarah asks about Thursday dinner, Jake can't make it."
Notification summaries generated controversy in late 2024 when the system produced inaccurate summaries of news alerts. Apple has issued model updates in iOS 18.3, 18.4, and 26.2 addressing this. Accuracy on news content has improved. Individual app summaries can be toggled in Settings > Notifications.
Genmoji
Genmoji generates custom emoji from text descriptions in any iMessage conversation. Open the emoji keyboard, tap the Genmoji star icon, and type a description. "A corgi wearing sunglasses in space," "a raccoon in a graduation cap," or a person's name to generate an emoji based on their photos in your Contacts. Results are animated sticker-style images. Genmoji runs entirely on-device and works without a connection.
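Whether Genmoji can be typed into a third-party text field is controlled by a UIKit flag. A minimal sketch with a hypothetical compose controller - `supportsAdaptiveImageGlyph` is the published UITextView property (iOS 18 and later), and the rest is illustrative:

```swift
import UIKit

final class MessageComposeViewController: UIViewController {
    // Hypothetical compose field for illustration.
    private let composeField = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Genmoji arrive as adaptive image glyphs (NSAdaptiveImageGlyph) inside
        // attributed text; this flag lets the text view accept and round-trip them.
        composeField.supportsAdaptiveImageGlyph = true
        composeField.allowsEditingTextAttributes = true
        composeField.frame = view.bounds
        view.addSubview(composeField)
    }
}
```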
Image Playground
Image Playground generates illustrative images from text prompts in three visual styles: Animation (flat, vector-style), Illustration (painterly, editorial style), and Sketch (pencil or ink-style line art). It explicitly does not generate photorealistic images - an intentional design constraint. Image Playground is accessible as a standalone app, inside iMessage, and as a panel within Notes. Generation runs on-device once the model assets are downloaded and typically takes 5-10 seconds per image.
ChatGPT Integration
When Apple Intelligence's on-device model lacks the capability for a request - detailed research, complex reasoning, specialized domain knowledge - Siri can hand off to ChatGPT. The handoff is strictly opt-in: a clear confirmation prompt appears before any request is sent to OpenAI's servers. As of iOS 26.x, the integration uses GPT-4o. The integration can be disabled in Settings > Apple Intelligence & Siri > ChatGPT.
With Gemini integration coming in iOS 27, Apple will offer two third-party model handoffs. The architecture suggests the system is designed to route different request types to different models based on capability fit.
Smart Reply in Messages
- Contextual reply suggestions - Three suggested replies appear above the keyboard when viewing a message thread. These reference the full thread context and propose complete replies: "Yes, 3pm on Wednesday works for me" rather than single words.
- Thread summaries - Long iMessage threads can be summarized by tapping a summary option at the top of the conversation, showing key points from recent unread messages.
- Writing Tools in compose - The full Writing Tools menu is available when drafting a message, identical to its behavior everywhere else.
What's Still Missing or Underdelivered
- Multi-step agentic Siri - The ability for Siri to chain actions across apps is in the iOS 27 beta but not in the stable iOS 26.x release. Siri can take individual App Intents actions, but not sequences without prompting at each step.
- Non-English language parity - Apple Intelligence in French, German, Spanish, Japanese, and Korean launched in iOS 26.2, but feature depth lags English. Notification summaries in non-English languages produce lower quality outputs.
- Image editing - Apple Intelligence has no generative photo editing beyond Clean Up and Memory Movies. There is no equivalent to Samsung's Generative Edit or Google's Reimagine for modifying photo content with text prompts. This is expected to change with iOS 27.
iOS 27: What's Coming in September 2026
The iOS 27 public beta as of early 2026 shows several confirmed Apple Intelligence additions:
- Gemini-powered Siri - Integration with Google's Gemini model to significantly expand Siri's reasoning and knowledge capabilities for complex, multi-part queries.
- Proactive Siri context - Siri surfaces suggested actions before you ask. A flight confirmation in Mail triggers a prompt to add the trip to Calendar and Wallet. A message asking for your address triggers an offer to share your location from Maps.
- Match My Style in Writing Tools - A new Writing Tools option that learns from your writing samples and applies your personal voice to rewrites, rather than defaulting to Apple's generic tone.
- Generative photo editing - Pixel-style background replacement and object generation, built on Image Playground's models but applied within the Photos editing workflow.
- Visual Intelligence expansion - Additional identification categories, reportedly including medications, circuit board components, and plant disease identification.
- Larger Private Cloud Compute models - Apple has been expanding PCC infrastructure, expected to enable more complex tasks currently requiring ChatGPT handoff to be handled within Apple's own system.
Frequently Asked Questions
Do I need to enable Apple Intelligence, or is it on by default?
Apple Intelligence is opt-in. A setup prompt appears on first use of a supported device or when updating to iOS 26. You enable or disable individual features in Settings > Apple Intelligence & Siri. Notification summaries can be toggled per-app in Settings > Notifications.
Does Apple Intelligence work in every language?
As of iOS 26.x, supported languages include English (US, UK, Australia, Canada, India), French, German, Italian, Spanish (US and Spain), Japanese, Korean, and Simplified Chinese. Portuguese and Dutch are in the iOS 27 beta. Feature availability within each language varies - Genmoji and Image Playground launched in additional languages later than Writing Tools and notification summaries.
Can Apple see my writing when I use Writing Tools?
Standard Writing Tools requests process on-device and never leave your iPhone. Tasks routed to Private Cloud Compute run on an architecture that, Apple states, prevents its servers from seeing the content of your requests in any form linkable to your identity. Apple has published the PCC server software for third-party security auditing. ChatGPT integration uses an anonymous session by default.
Does using Apple Intelligence drain the battery faster?
Sustained AI workloads like generating a Memory Movie or processing a long document for summary consume noticeable power. Passive features like notification summaries run as lightweight background processes with minimal battery impact. On an iPhone 17, heavy Apple Intelligence use adds roughly 5-8% additional battery drain over a full workday compared with a day without the features active.
What is the difference between the iPhone 17 and iPhone 17 Pro for AI?
Both run Apple Intelligence at full capability. The A19 Pro in the iPhone 17 Pro and Pro Max has marginally more Neural Engine throughput for sustained AI workloads, but in everyday use the differences are imperceptible. The Pro models also add a more advanced camera system and more RAM. For AI features specifically, the standard iPhone 17 at $799 delivers the same day-to-day experience as the Pro at $1,099.
How much storage does Apple Intelligence use?
The core on-device language model is roughly 2 GB, but the full set of Apple Intelligence model assets is larger - Apple's stated storage requirement has grown from about 4 GB at launch to roughly 7 GB in current releases. The assets download automatically after opting in on a supported device. iOS will temporarily offload them if storage is critically low, but they re-download the next time an Apple Intelligence feature is triggered. There is no option to delete individual model components while keeping the feature enabled.
