Take a look around the artificial intelligence (AI) landscape, and it’s clear that Apple is not among the frontrunners. Although the company announced its own Apple Intelligence offering at its Worldwide Developers Conference (WWDC) in June 2024 — hailing it as “AI for the rest of us” — it’s been beset by delays and missing features in the two years since it was revealed.
With rivals like ChatGPT and Google Gemini pulling further ahead, it seems like it’s harder than ever for Apple to catch up.
Masquerading as Siri
According to Bloomberg's Mark Gurman, Apple plans to bring what it calls ‘Extensions’ to Siri at WWDC 2026. These would “let users install third-party AI chatbots beyond ChatGPT and run them inside Siri,” Gurman says. This is separate from the ongoing arrangement whereby ChatGPT and Google Gemini are harnessed to power some Apple Intelligence features.
It’s the last bit of Gurman’s quote that has me concerned. Right now, Siri has wide-ranging access to the contents of your iPhone. The list of tasks it can currently fulfil (or will soon be able to, when the new Apple Intelligence-powered version lands) is extensive and includes looking up your contacts, knowing places you’ve recently visited, recognizing people in your photos, and more. Apple has been happy to allow that because it controls the process from end to end.
But what if Apple opens up this kind of sensitive information to third-party chatbots? Will your data be ingested by these opaque models and used to train them, presenting a not-insignificant risk of leaks?
The very fact that AI algorithms are a black box means we just don’t know what could happen to your information. You’d hope that Apple would put stringent restrictions on what the AI tools can do, but do we know that those guardrails will be foolproof? I doubt it.
And then there’s the fact that all this will be going on inside a Siri-shaped wrapper (as Gurman says, Apple’s plan is to run these chatbots “inside Siri”).
If you interact with ChatGPT when it is masquerading as Siri, my feeling is that you’re more likely to let your guard down and feed it more sensitive information because you’re used to Siri and you know it comes from a privacy-focused company like Apple. Yet in reality, you’ll be using a third-party tool, not Apple’s locked-down chatbot.
Without clear signposting that an external AI tool is being used, that could have consequences for your data privacy, as well as for a company like Apple, which usually positions itself as working hard to protect that very privacy.
Losing control
That’s not the only risk I can foresee in this project. There’s a good chance Apple could end up losing control of Siri, and that could have consequences for everyone involved.
For instance, when or if something goes wrong with one of these Siri-wrapped chatbots, who are users going to blame? At least with the standalone ChatGPT app, users know that it’s an external product. That allows the blame to be properly allocated and also permits Apple to wash its hands of the problem.
But what if you have a seriously negative experience with Siri when it’s powered by a third-party AI tool? Are you going to necessarily know that it’s not Siri that caused the issue, but the underlying chatbot? If Google Gemini is embedded within Siri, and Siri is the interface you interact with, you’re probably going to lay the blame squarely at Apple’s feet. After all, it’s that familiar Siri experience that you’re getting, even if the guts are different.
That means there’s a good chance of negative blowback coming Apple’s way, whether or not the company is actually at fault. Siri has enough problems as it is. Is it worth creating more?
Gurman added a further detail: Siri Extensions will have their own “dedicated App Store section, effectively creating an AI App Store.” Users will be able to download and install Extensions directly from this trusted place, then have them integrate right into Siri.
On the face of it, this sounds like a positive development. Apple’s App Store review process is conducted by humans and is generally fairly reliable, despite the occasional slip-up.
But we already know that Apple is struggling with AI apps that are submitted to the App Store. There have been multiple reports of delays to App Store approval as Apple’s reviewers seem overwhelmed by the number of fully vibe-coded apps that have been submitted. After all, if AI can write your entire app in a matter of days or even hours, the quantity of submitted apps is going to increase substantially as the accessibility of creating and publishing an app goes up, and the cost of doing so goes down.
Apple has also blocked several vibe coding apps from submitting updates to the App Store, citing store rules concerning apps that change their code or purpose. Vibe coding apps fall foul of this rule because they allow users to generate code on the fly. Since Apple isn’t able to review code that’s added after the app was submitted, it has no way of knowing what could be created with a vibe coding app.
Now imagine a reality in which Apple’s already overworked App Store review team is confronted with an entire sub-section dedicated to AI add-ons for Siri. How is the company going to guarantee the security of its users’ devices in a situation like this? And how is it going to guarantee a timely submission and approval process if it’s adding an extra sub-section to its already bustling App Store?
Any failures or dangerous Extensions that get through will not only potentially damage users’ devices but will also generate bad press for Apple. The company won’t want either of those things, especially when they’re linked to its much-discussed Apple Intelligence and Siri features.
Apple is a firm obsessed with control. It feels hard to imagine it’s worth the risk of jeopardizing that control to carve out an increasingly small AI niche for itself, even as Apple has seemingly given up hope of becoming a leader in the field. Yet that seems to be the risk of rushing ahead with this chatbot plan — and the already much-maligned Siri could be hit with the fallout.