Apple has quietly built its own ChatGPT-like chatbot, though it is not yet available to the general public. According to a Bloomberg report by Mark Gurman, the company is using the tool internally to revamp Siri. Codenamed Veritas, the chatbot serves as a testing platform for Apple engineers. Rather than being aimed at consumers, Veritas lets the company’s software teams experiment with advanced Siri features, including improved contextual understanding, the ability to handle tasks across multiple apps, and smarter use of personal data such as emails, music preferences, and app settings. In essence, it gives Siri a better memory, deeper integration, and a more proactive approach to handling user requests.
By testing features rigorously in-house, Apple aims to ensure the assistant meets its famously high standards before rolling it out to millions of iPhones worldwide. Veritas reportedly mirrors the design of popular chatbot services: it supports multiple simultaneous conversations, remembers previous interactions, and handles longer, more natural dialogues. It is built on a new Apple framework called Linwood, which combines Apple’s own large language models with AI technology from third-party partners. Apple’s ambitions extend beyond the chatbot interface itself. The company originally planned to ship a revamped “Apple Intelligence Siri” with iOS 18, but after the features fell short of performance expectations, it scrapped that version and shifted to a more sophisticated architecture.
This “second-generation” approach leans heavily on large language models (LLMs), the same class of AI systems that powers OpenAI’s ChatGPT, Anthropic’s Claude, and Google’s Gemini. It marks a significant change in direction: instead of merely handling timers, reminders, or trivia, the future Siri is meant to carry on natural, ongoing conversations and manage complex, multi-step requests. Ultimately, Apple wants to turn Siri from a reactive voice assistant into a true conversational partner. Users will need to be patient, though. Reports indicate that Apple plans to launch the LLM-powered Siri in early 2026, likely alongside the iOS 26.4 update in March. That is a year later than originally planned, underscoring Apple’s reluctance to ship AI features before they are ready.
With hundreds of millions of active devices in play, the stakes are considerable. Before then, Apple may unveil a redesigned look for Siri toward the end of 2025. The redesign, described as more “humanoid,” could become the visual marker of the assistant’s new era, much as the Finder logo has long stood for the Mac. Notably, Apple is not ruling out partnerships. The company has reportedly held talks with OpenAI, Anthropic, and Google about incorporating third-party AI models into Siri. If those discussions pan out, the 2026 version of Siri could blend Apple’s in-house systems with external AI, a rare move for a company that prides itself on building everything in-house.
For now, the upgraded Siri remains in internal testing, available only to Apple’s engineers. If all goes to plan, though, users may soon get a Siri that not only catches up with its AI rivals but potentially surpasses them.