By John Gruber
Mark Gurman, with yet another scoop:
The new system will allow Siri to take command of all the features within apps for the first time, said the people, who asked not to be identified because the initiative isn’t public. That change required a revamp of Siri’s underlying software using large language models — a core technology behind generative AI — and will be one of the highlights of Apple’s renewed push into AI, they said. [...]
Siri will be a key focus of the WWDC unveiling. The new system will allow the assistant to control and navigate an iPhone or iPad with more precision. That includes being able to open individual documents, move a note to another folder, send or delete an email, open a particular publication in Apple News, email a web link, or even ask the device for a summary of an article.
This sounds a lot like a large action model, not just a large language model. It makes sense if Apple can pull it off.
In 2018, Apple launched Siri Shortcuts as well, letting users manually create commands for app features. The new system will go further, using AI to analyze what people are doing on their devices and automatically enable Siri-controlled features. It will be limited to Apple’s own apps at the beginning, with the company planning to support hundreds of different commands.
This makes me think developers will need to adopt new APIs to describe and define the sorts of actions Siri can perform — like Siri Shortcuts, but richer and, hopefully, easier to support. According to Gurman, this feature isn’t slated to roll out to iOS 18 users until sometime next year. That makes sense, given that the ink seemingly isn’t yet dry on the Apple-OpenAI partnership.
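For a rough sense of what that might look like, such APIs would presumably build on Apple's existing App Intents framework, which is already how apps expose actions to Shortcuts. Here's a minimal, purely speculative sketch; the NotesStore type and the "move note" action are made up for illustration, not anything Apple has announced:

```swift
import AppIntents

// Made-up stand-in for an app's own model layer (not an Apple API).
final class NotesStore {
    static let shared = NotesStore()
    func move(noteTitled title: String, toFolder folder: String) throws {
        // A real app would update its data store here.
        print("Moving \(title) to \(folder)")
    }
}

// One action the app exposes to the system via App Intents.
struct MoveNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Move Note"
    static var description = IntentDescription("Moves a note to another folder.")

    // Parameters Siri or Shortcuts can fill in from the user's request.
    @Parameter(title: "Note")
    var noteTitle: String

    @Parameter(title: "Destination Folder")
    var folderName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        try NotesStore.shared.move(noteTitled: noteTitle, toFolder: folderName)
        return .result(dialog: "Moved \(noteTitle) to \(folderName)")
    }
}
```

The idea, presumably, is that once an app declares its actions in a structured way like this, an LLM-backed Siri could interpret a natural-language request and invoke (or chain) the right ones.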
Writing at 9to5Mac, Ryan Christoffel puts it thus:
Presumably, this change will lead to a lot fewer occasions of asking Siri to complete a task and finding it has no idea what you’re talking about. A more intelligent Siri that can understand natural language for a much wider array of commands sounds like the Siri we have always expected but never quite got.
That sounds like exactly where Apple’s goalposts should be.
★ Thursday, 30 May 2024