Apple plans to turn Siri into a full AI chatbot to challenge ChatGPT and Gemini

Image Credit: Omid Armin / Unsplash

Apple is preparing a major transformation of Siri that could redefine how users interact with its devices. According to recent reports, the company plans to evolve Siri into a fully conversational AI chatbot later this year as it works to close the gap with generative AI leaders such as OpenAI and Google.

The revamped assistant is expected to behave far more like modern AI chatbots, moving beyond short commands and scripted replies. Internally, the new Siri project is reportedly codenamed Campos and is designed to replace the existing Siri interface rather than sit alongside it.

Apple’s renewed push follows growing pressure in the AI space, where conversational assistants like ChatGPT and Google Gemini have set new expectations for how naturally users can interact with software.

A shift toward true conversation

While users would still activate Siri using familiar triggers such as voice commands or hardware buttons, the experience itself is expected to feel fundamentally different. Instead of issuing one-off requests, users may be able to hold ongoing conversations, ask follow-up questions, and switch seamlessly between voice and text input.

This change would mark a clear departure from the current version of Siri, which has often been criticized for its limited understanding and rigid responses. By adopting a chatbot-style interaction model, Apple aims to make Siri feel more responsive, adaptive, and useful across a wider range of everyday tasks.

Reports suggest that the new system will support more complex queries, contextual awareness, and longer interactions that feel closer to natural human dialogue.

Deep integration across Apple platforms

Unlike standalone AI apps, the next-generation Siri is expected to be built directly into Apple’s operating systems. According to reporting from Bloomberg, the chatbot-style Siri will be deeply integrated into iOS 27, iPadOS 27, and macOS 27.

Rather than launching as a separate application, Siri would function as a core layer across Apple devices, enabling tighter connections with system features and first-party apps. This approach could allow users to interact with files, messages, photos, and apps using conversational language instead of navigating menus or settings manually.

The assistant is also expected to handle tasks such as web searches, content creation, image generation, summarizing information, and analyzing uploaded documents. These capabilities would bring Siri closer to the feature set already offered by leading AI chatbots.

Smarter actions inside everyday apps

One of the most notable changes involves how Siri could interact with Apple’s own apps. The upgraded assistant is reportedly being designed to understand context across Mail, Calendar, Photos, Files, and Messages.

This could allow users to draft emails based on calendar events, edit photos using voice instructions, or locate specific files and conversations with simple natural language requests. By embedding AI capabilities directly into core apps, Apple appears to be focusing on practical, device-level intelligence rather than novelty features.
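For context, Apple already lets apps expose actions to Siri through its App Intents framework, and a chatbot-style Siri would plausibly build on a similar mechanism to reach into Mail, Calendar, and other apps. The Swift sketch below is purely illustrative: the intent name, parameter, and behavior are hypothetical and are not drawn from the reports.

```swift
import AppIntents

// Hypothetical intent: draft a follow-up email for a named calendar event.
// Illustrative only; the reports do not describe Apple's internal implementation.
struct DraftFollowUpEmail: AppIntent {
    static var title: LocalizedStringResource = "Draft Follow-Up Email"
    static var description = IntentDescription("Drafts a follow-up email for a calendar event.")

    @Parameter(title: "Event Name")
    var eventName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would look up the event and pass that context
        // to the assistant; here we simply return a canned draft.
        let draft = "Hi all, following up on \"\(eventName)\"..."
        return .result(dialog: "Here's a draft you can edit: \(draft)")
    }
}
```

Today, intents like this are invoked by name; the reported change is that a conversational Siri could reach the same kind of app actions from free-form dialogue rather than fixed phrases.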

The goal is to make Siri a more proactive and capable assistant that reduces friction across daily workflows.

Gemini models expected to power the upgrade

To accelerate this transformation, Apple is widely reported to be turning to AI models developed by Google, with the intelligence behind Siri’s overhaul expected to rely heavily on Gemini technology, at least in the near term.

This partnership highlights how seriously Apple is taking the challenge of generative AI, even if it means leaning on external models while it continues developing its own. Before the full overhaul arrives, a more limited update to Siri and Apple Intelligence is expected with iOS 26.4, keeping the current interface while improving performance and understanding with Gemini-based models.

That update is seen as a stepping stone toward the more radical redesign planned later in the year.

WWDC reveal and rollout timeline

The chatbot version of Siri is expected to make its public debut at Apple’s Worldwide Developers Conference in June. If unveiled as anticipated, it would become one of the headline features of iOS 27 and macOS 27.

A broader rollout is likely to follow in the fall, aligning with Apple’s usual software release cycle. While Apple has not officially commented on the reports, the scale and scope of the changes suggest a significant strategic shift.

As competitors continue to embed conversational AI deeply into their platforms, Apple appears ready to reimagine Siri from the ground up, positioning it as a central interface for interacting with the entire Apple ecosystem.
