By John Gruber
In an item earlier this week observing that Swift Assist, the most ambitious Xcode-related Apple Intelligence feature shown at WWDC last year, not only hasn’t yet shipped but still isn’t in beta, I wondered whether Apple actually demoed it live last year. John Voorhees, writing for MacStories from WWDC last June, reports that they did:
Earlier today, I got the very first live demo of Swift Assist, one of the many developer tools introduced today by Apple. I also saw code completion in action. It was an impressive demo, and although the tools seem like magic and will undoubtedly be valuable to developers, they do have their limitations, which are worth exploring. [...]
The code completion demo also included a live demo of Swift Assist. Unlike code completion, Swift Assist requires an Internet connection because requests are sent to the cloud. As a result, it takes several seconds for Swift Assist to return results. The delay was noticeable compared to the speed of code completion, but it wasn’t a painfully long wait either.
I heard this week from a third-party developer who was invited to Apple for a one-day hands-on session with Swift Assist late last year. Swift Assist was definitely working, but seemingly not well. From that source: “The UI is very much complete (just like Siri), but the results the LLM produces were not very good. It could make very basic demo apps with a prompt like ‘make an app that takes the NASA satellite JSON and shows the current satellites traveling overhead right now’, but not too much more than that. It fell apart on more complex tasks.”
I remember the remote-inference-only aspect of the Swift Assist presentation from my Xcode briefing at WWDC: because of its complexity, Swift Assist would not execute locally, and would run only via Private Cloud Compute. My own notes on this from WWDC were mostly about the privacy and security implications. The pitch was that developers should feel safe using Swift Assist even with confidential code and projects, because Private Cloud Compute would be guaranteed private. I also remember thinking, at the time, that I should be more skeptical of the Apple Intelligence features that would execute locally, on-device, than of the ones that would execute remotely, via Private Cloud Compute. Almost all the “AI” features from other companies over the previous two years ran entirely in the cloud, so Apple’s claim that Apple Intelligence would perform much of its inference locally, on-device, seemed like the stretch goal.
But now in March 2025 I’m beginning to think it’s the other way around. What features and aspects of Apple Intelligence run in Private Cloud Compute, today, in March 2025? Do any? I’ve been poking around for a few days and I don’t have any answers. Is Private Cloud Compute running in production yet? How would we know? If you know, let me know.
Update: “How to Generate a Report of Apple Intelligence Requests Sent to Private Cloud Compute”.
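For the curious, here’s a minimal sketch of how one might poke at such an exported report. Per the linked piece, the export is a JSON file; its exact schema isn’t documented here, so rather than assume field names, this just walks every string in the JSON and flags anything mentioning “cloud”. The default file name and that matching heuristic are my assumptions, not Apple’s documented format.

```swift
import Foundation

// Recursively collect every string (keys and values) in a parsed JSON tree,
// so we don't have to assume anything about the report's schema.
func allStrings(in json: Any) -> [String] {
    switch json {
    case let s as String:
        return [s]
    case let array as [Any]:
        return array.flatMap { allStrings(in: $0) }
    case let dict as [String: Any]:
        return Array(dict.keys) + dict.values.flatMap { allStrings(in: $0) }
    default:
        return []
    }
}

// Hypothetical default file name; pass the real path to your exported
// Apple Intelligence report as the first argument.
let path = CommandLine.arguments.count > 1
    ? CommandLine.arguments[1]
    : "Apple_Intelligence_Report.json"

do {
    let data = try Data(contentsOf: URL(fileURLWithPath: path))
    let json = try JSONSerialization.jsonObject(with: data)

    // Crude heuristic: any string mentioning "cloud" might indicate a
    // request routed to Private Cloud Compute rather than on-device.
    let hits = allStrings(in: json).filter {
        $0.localizedCaseInsensitiveContains("cloud")
    }

    if hits.isEmpty {
        print("No cloud-related strings found in the report.")
    } else {
        print("Strings that may indicate Private Cloud Compute activity:")
        hits.forEach { print("  \($0)") }
    }
} catch {
    print("Couldn't read or parse \(path): \(error)")
}
```

A report with no cloud-related strings wouldn’t prove everything ran on-device, of course, but it’s one way to start answering the “how would we know?” question.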
★ Saturday, 15 March 2025