
At MacStories, Federico Viticci offers some excellent speculation about how Apple might catch up in AI:
Yesterday, Wayne Ma, reporting for The Information, published an outstanding story detailing the internal turmoil at Apple that led to the delay of the highly anticipated Siri AI features last month…. one tidbit in particular stood out to me: Federighi has now given the green light to relying on third-party, open-source LLMs to build the next wave of AI features….
“Using” open-source models from other companies doesn’t necessarily mean shipping consumer features in iOS powered by external LLMs. I’ve seen some people interpret this paragraph as Apple preparing to release a local Siri powered by Llama 4 or DeepSeek, and I think we should pay more attention to that “build the best AI features” (emphasis mine) line.
This is a really smart supposition: that Apple's AI team may have considered some techniques off limits, possibly due to the classic "Not Invented Here" syndrome. But AI development can be weird, and if Federighi's team has been told that there are no sacred cows, there are ways for Apple to catch up quickly.