Gemini stepping in to power Apple’s AI feels like a turning point for the iPhone, especially after years of Apple lagging behind in the AI race. After experimenting with stopgap solutions and leaning on ChatGPT as a fallback, Apple has finally placed Google’s Gemini models at the center of its AI strategy. This is not just about making Siri smarter. It is about reshaping how Apple Intelligence works across the system, from everyday interactions to deeper automation.
The immediate benefit for users is obvious. Siri is expected to become more conversational, more context-aware, and far less dependent on handing tasks off to third-party chatbots. Instead of hitting walls or asking permission to redirect a query to ChatGPT, Siri should handle complex requests directly. Google has already demonstrated this shift with Gemini’s Personal Intelligence features, and Apple now has the foundation to offer something comparable on iPhone.
What makes this situation slightly ironic is that Apple once marketed the idea of a deeply personal, all-knowing Siri long before Google did. That vision was quietly shelved when the technology failed to mature fast enough. With Gemini now involved, Apple has a second chance to deliver on that promise. Yet while Siri and Apple Intelligence dominate the conversation, there is one overlooked area that desperately needs attention: the Shortcuts app.
Shortcuts remains one of the most powerful yet underappreciated tools on iOS. On paper, it is capable of executing sophisticated multi-step automations that rival desktop workflows. With Apple Intelligence now integrated, it can already perform impressive AI-driven actions. A single tap can analyze what is playing on your screen, identify a movie or show, and tell you where it is streaming. Another shortcut can turn a screenshot into a searchable memory with summaries, tags, and source links saved directly to Notes.
All of this can run on on-device models, entirely offline, which highlights just how much potential Shortcuts holds. Unfortunately, that potential is buried under a confusing and intimidating creation process. For newcomers, building even a simple automation can feel overwhelming. For experienced users, assembling longer workflows often becomes an exercise in patience.
The underlying scripting system relies heavily on variables, conditional logic, and hidden actions that are not easy to discover. Many users only realize an action exists after digging through app-specific menus or reading long tutorials online. Even after understanding the basics, it is easy to lose track of how data flows through a shortcut once it grows in complexity.
Using ChatGPT to help design shortcuts might seem like a solution, but in practice it often creates more confusion. The instructions tend to be vague, incomplete, or reference actions that do not exist inside the Shortcuts app. This disconnect explains why so many users rely on shared iCloud links rather than building their own automations from scratch.
The frustrating part is that Apple already has most of the pieces required to fix this. Apple Intelligence and App Intents allow Siri to understand what actions apps can perform without opening them. Apple itself explains this in its App Intents documentation on developer.apple.com, noting that Siri can suggest and execute app actions across the system.
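To make the App Intents idea concrete, here is a minimal sketch of how an app can expose an action to Siri and Shortcuts. The intent name, parameter, and dialog text are all hypothetical and purely illustrative; only the AppIntents framework types (`AppIntent`, `@Parameter`, `perform()`) are Apple’s.

```swift
import AppIntents

// Hypothetical intent: "FindStreamingServiceIntent" and its behavior
// are illustrative, not an existing Apple or third-party API.
struct FindStreamingServiceIntent: AppIntent {
    static var title: LocalizedStringResource = "Find Streaming Service"

    @Parameter(title: "Title")
    var showTitle: String

    // Runs in the background, so Siri and Shortcuts can invoke it
    // without bringing the app to the foreground.
    static var openAppWhenRun: Bool = false

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would query its catalog here; this sketch
        // returns a canned answer so the flow is visible.
        return .result(dialog: "\(showTitle) is available to stream.")
    }
}
```

Once an app declares intents like this, they surface as actions inside Shortcuts and become discoverable to Siri, which is exactly the plumbing a conversational shortcut builder would rely on.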
Gemini already operates in a similar way inside Google’s ecosystem and even extends to third-party apps like WhatsApp. You can describe a task in natural language and Gemini translates it into actions behind the scenes. Siri can technically do this as well, but the experience has not been unified or exposed properly, especially inside Shortcuts.
Imagine telling Siri to create a shortcut that activates Focus mode at a specific time, silences notifications except for Slack, and delivers a summarized digest of group messages. Today, that scenario remains unrealistic. With a conversational layer powered by Gemini, it should be trivial.
What Shortcuts truly needs is a natural language interface that turns plain descriptions into working automations. Instead of dragging blocks and guessing which variables connect, users should be able to describe what they want and refine it through conversation. Apple could also introduce an agent-style assistant dedicated to creating and editing shortcuts through text or voice.
Other companies are already moving in this direction. Replit recently announced an AI tool capable of generating fully functional mobile apps from a simple description and publishing them directly to the App Store, as reported by Digital Trends. While that approach may be overkill for many users, the underlying concept fits Shortcuts perfectly.
Nothing, the Android manufacturer, has launched Playground, a no-code platform that lets users describe an app and build it directly on their phone. Google has its own equivalent in Opal, which allows users to create web apps using Gemini. These tools lower the barrier to creation by removing traditional development complexity.
Apple could adapt this idea by allowing shortcuts to transform into lightweight app-like experiences. Shortcuts already supports sharing and community collaboration through iCloud links. Turning them into mini apps with conversational editing would make them easier to customize and far more accessible.
This approach would also address a major pain point. Many shared shortcuts break when users try to modify them. A conversational interface could explain what each step does, suggest improvements, and safely adjust parameters without breaking the workflow.
There is also a trust advantage here. Instead of relying on third-party automation apps that require subscriptions or cloud processing, users could keep everything inside Apple’s ecosystem. Shortcuts could run fully offline or use Apple’s Private Cloud Compute for AI-heavy tasks, preserving privacy and security.
With Gemini now part of Apple’s AI stack, the opportunity is enormous. If Apple applies the same AI-driven overhaul to Shortcuts that it promises for Siri, the app could evolve from a niche power-user tool into a core feature that defines the iPhone experience. Android has already shown what is possible when AI agents are deeply integrated. Apple now has the tools to deliver something equally transformative, provided it finally gives Shortcuts the attention it deserves.