By John Gruber
WorkOS: Scalable, secure authentication, trusted by OpenAI, Cursor, Perplexity, and Vercel.
How has your week been? My week was ... busy. That includes a new episode of The Talk Show recorded yesterday, dropping in your favorite podcast app soon. Amidst all the writing (and talking) I’ve been doing, I’m also working on filling up open weeks on the sponsorship schedule for Q2.
After a very full February and March, I’ve got a bunch of openings in the next few months — and openings for the next two weeks, starting with this Monday.
Weekly sponsorships have been the top source of revenue for Daring Fireball ever since I started selling them back in 2007. They’ve succeeded, I think, because they make everyone happy. They generate good money. There’s only one sponsor per week and the sponsors are always relevant to at least some sizable portion of the DF audience, so you, the reader, are never annoyed and hopefully often intrigued by them. And, from the sponsors’ perspective, they work. My favorite thing about them is how many sponsors return for subsequent weeks after seeing the results.
If you’ve got a product or service you think would be of interest to DF’s audience of people obsessed with high quality and good design, get in touch. And again, as I type this, the next two weeks remain open.
My thanks to WorkOS for sponsoring DF, once again, this last week. This has been WorkOS’s Launch Week, and they’ve got a slew of new features to show. Honestly, though, you should check out their Launch Week page just to look at it — it’s beautiful, fun retro-modern pixel-art goodness. Great typography too. I wish every website looked even half this cool.
New features launched just this week include:
Ookla, the company behind the Speedtest download/upload bandwidth testing app:
Although it’s early in the adoption curve for the iPhone 16e, we analyzed the performance of the new device from March 1st through March 12th, and compared it to the performance of iPhone 16, which has a similar design and the same 6.1” screen. Both devices run on the same Apple-designed A18 SoC.
When we compare Speedtest Intelligence data from the top 90th percentile (those with the highest performance experience) of iPhone 16e and iPhone 16 users from all three of the top U.S. operators, we see the iPhone 16 performing better in download speeds. However, at the opposite end, with the 10th percentile of users (those who experience the lowest performance) we see the iPhone 16e performing better than the iPhone 16.
There are some differences, but overall the 16e’s cellular performance seems great for the spectrums it supports. And given the efficiency claims from Apple, it might be the better overall modem.
My number one tip for becoming a Mac power user is to get into Keyboard Maestro. Using Keyboard Maestro feels like gaining superpowers. I keep meaning to write more about Keyboard Maestro, and so I’m just going to start documenting all the little use cases I find for it. Here’s one from today.
I use MarsEdit to publish at least 99 percent of the posts on this site. (The other 1 percent are posts I create on my phone, using the web interface for Movable Type.) I use MarsEdit a lot. About once a week or so, I accidentally try to paste text in MarsEdit when I think I have text on my clipboard, but it’s actually an image. When you paste an image in MarsEdit, it’s not like pasting into Mail or Notes or TextEdit, where the image just goes into the text. So MarsEdit, trying to be helpful, opens its Upload Utility window — which, if I were using WordPress or some other CMS, might allow me to upload the image to my server for referencing from the HTML of the blog post. That’s not how my system works, and not how I want it to work, so every time this happens I have to close the Upload Utility window. And every time, I try to do this by hitting the Esc key on my keyboard. But the Upload Utility window isn’t a dialog box with a Cancel button that would be triggered by Esc. It’s a regular window. So after hitting the Esc key, which doesn’t do anything in this context, I then remember, once again, that I need to hit ⌘W instead. (I think I don’t naturally think to hit ⌘W because my instincts tell me ⌘W would try to close the blog window I’m writing in.)
Today it happened again, and finally the notion occurred to me that I could fix this with Keyboard Maestro. My first thought was that I could create a macro that would close the frontmost window in MarsEdit if, and only if, the frontmost window was named “Upload Utility”. A second later it occurred to me that I could probably do better than that, and prevent the Upload Utility window from opening in the first place if I ever try to paste an image in MarsEdit.
I was right. This wasn’t just super easy to create in Keyboard Maestro, it was super quick. I’ve spent 10× more time writing about this macro here than I did creating it. I think that’s why I so seldom write about my little hacks in Keyboard Maestro — they not only save me time and eliminate annoyances once they’re created, but they’re so easy to create that I just get back to whatever I was previously doing after making a new one.
First, I have a group (think: folders) in Keyboard Maestro for every app for which I’ve created app-specific macros. You just create a new group and set it to only be available when one (or more) specific applications are active. Inside my group for MarsEdit, I created a new macro named “Don’t Paste Images”.
It’s triggered by the hot key sequence ⌘V. That means every single time I paste in MarsEdit, this macro will run. Keyboard Maestro is so frigging fast that I’ll never notice. (Keyboard Maestro macros execute so fast that in some scenarios, you have to add steps to pause for, say, 0.2 seconds to keep the macro from getting ahead of the user interface it’s manipulating.)
The macro executes a simple if-then-else action with the following pseudocode logic:
if the System Clipboard has an image
    play a sound
else
    simulate the keystroke ⌘V
That’s the whole thing. And it worked perfectly the first time I tried it. Here’s a screenshot of my macro.
So if I type ⌘V in MarsEdit, and the clipboard contains an image, I just hear a beep. (I could just default to the system beep, but I chose the standard MacOS “Bottle” sound just for this macro — I sort of want to know that it’s this macro keeping me from pasting whatever text I wrongly thought was on my clipboard, so I want a distinctive sound to play.) Nothing gets pasted, so MarsEdit’s Upload Utility window doesn’t appear.
If the clipboard doesn’t contain an image, then Keyboard Maestro simulates a ⌘V shortcut and that gets passed to MarsEdit, and from my perspective as a user, it’s just like a normal paste of the text I expected. I have a few macros that work like this, where the macro is triggered by an application’s own keyboard shortcut, and the macro will (if certain conditions are met) pass through the same simulated keyboard shortcut to the application. When I first tried this, many years ago, I was half worried that it would trigger an infinite loop, where the simulated keystroke from the Keyboard Maestro macro would re-trigger the macro. I was wrong to worry — Keyboard Maestro is too clever for that.
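The macro’s logic, including that pass-through behavior, can be sketched in plain code. This is purely illustrative — the real thing is a Keyboard Maestro if-then-else action, not a script, and these function and argument names are mine, not Keyboard Maestro’s:

```python
# Logic-only sketch of the "Don't Paste Images" macro. All names here are
# illustrative; Keyboard Maestro implements this as a GUI-built action.
def handle_cmd_v(clipboard_type: str, simulated: bool = False) -> str:
    """Decide what a ⌘V press should do inside MarsEdit."""
    if simulated:
        # Keyboard Maestro's own simulated keystroke passes straight through
        # to the app instead of re-triggering the macro -- no infinite loop.
        return "paste"
    if clipboard_type == "image":
        return "play sound"      # block the paste; no Upload Utility window
    return "simulate ⌘V"         # forward the keystroke to MarsEdit
```

The `simulated` flag models why the macro never loops: the keystroke Keyboard Maestro generates is not treated as a fresh trigger.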
You almost certainly don’t have my particular problem with the occasional inadvertent pasting of images into MarsEdit. But I bet you have your own esoteric annoyances related to your own most-used apps and most-frequent tasks. Keyboard Maestro lets you effectively add your own little features to your favorite apps — often with no “scripting” at all. The best part is, while writing this very blog post, my new “Don’t Paste Images” macro saved me from seeing that cursed Upload Utility window once more, because I had the screenshot of the macro on my clipboard, when I thought I had copied the URL for it on my server. ★
Emma Roth, The Verge:
TechCrunch has a new owner, again. Yahoo has sold the tech news site to the private equity firm Regent for an undisclosed sum, according to an announcement on Friday.
Regent is the same company that snapped up Foundry, the firm behind outlets like PCWorld, Macworld, and TechAdvisor on Thursday. Founded in 2005, TechCrunch has experienced many shakeups in ownership after AOL acquired the site in 2010.
A lot of shakeups in a lot of media companies’ ownership lately. Steady as she goes here at The Daring Fireball Company, a subsidiary of Fedora World Media Industries.
Matthew Belloni has a very good take on Apple TV+ at Puck (that’s a gift link that should get you through their paywall — but which requires you creating a free account, sorry):
All of which fed into the self-centered fears of my lunch date. What, if anything, does the current state of Apple mean for its entertainment business? After all, more than five years into the Apple TV+ experiment, it’s never been entirely clear what C.E.O. Tim Cook and services chief Eddy Cue are up to in Hollywood. Certainly not making money, at least not in the traditional sense. The Information reported today that Apple lost $1 billion on Apple TV+ last year, following a Bloomberg report that more than $20 billion has been shoveled into making original shows and movies since 2019. That’s not nothing, even for a company worth $3 trillion.
The “loss” number is a bit misleading, of course, considering Apple has always said that a key goal is to leverage Leo DiCaprio and Reese Witherspoon to thicken its brand halo and the device “ecosystem,” ultimately boosting its other businesses. But still… for all its billions, Apple TV+ has accumulated only about 45 million subscribers worldwide, according to today’s Information report and other estimates.
That’s far less than Disney+, Max, and Paramount+, all of which launched around the same time. Those rival services are attached to legacy studios with rich libraries, but they’re not attached to a company with $65 billion in cash on hand and a device in the pockets of 1 billion people that also delivers bundle-friendly music, news, and games. Apple declined to confirm or comment on any numbers, but a source there suggested the subscriber number is higher than 45 million and that the global nature of the sub base is being undercounted by U.S.-oriented research firms. Maybe. The company reveals zero performance data beyond B.S. “biggest weekend ever!” press releases that the trades accept without skepticism and producers like Ben Stiller and David Ellison post with “blessed” emojis on their social media. No one outside the company really knows how the Apple TV+ business is performing.
One interesting nugget is this chart, which suggests that subscriptions to TV+ have boomed since Apple and Amazon worked out a deal to sell TV+ subscriptions through Amazon Channels in Prime Video at the end of last year. That deal has, seemingly, moved the needle. Another interesting nugget is that TV+ seems to suffer from a higher churn rate than other streaming services. Said Belloni’s Puck colleague Julia Alexander, “Fewer than 35 percent of all subscribers keep the service for longer than six months.”
That’s kind of crazy. I’d think TV+ would have less churn, not more, than the industry average — that the Apple TV+ audience is small but loyal. Perhaps this is the unsurprising side effect of Apple giving away 3-month trials when you purchase new devices. But I also truly wonder if TV+ subscriptions are the hardest for industry groups to measure, because so many people who do subscribe watch through tvOS (or, on their phones, on iOS) where everything is private. Belloni hints at this, and says little birdies at Apple told him the TV+ subscriber base is larger than they’re getting credit for.
And how do you count Apple One subscribers toward TV+’s subscriber base? My vague theory about Cue and Cook’s thinking has been that TV+ is one leg among several on the stool of reasons to subscribe to Apple One. Apple will take subscribers who pay only for TV+, or only for TV+ and Apple Music, but what they really want is to get people to subscribe to Apple One, which, because it includes iCloud storage, almost certainly has very little churn.
Belloni closes thus:
Apple wouldn’t be the first tech powerhouse to dabble in professionally produced content only to retreat. [...] Neither Cook nor Cue has suggested anything like that, and Apple, in just over five years, has become a reliable partner and a high-quality buyer for Hollywood shows and movies. In some ways, it’s remarkable how fast Apple TV+ became part of the entertainment community. Whether that lasts is the question.
Here’s where I will point out that Apple isn’t like other tech companies. Apple isn’t a move fast and break things company. They’re a measure twice, cut once company. When they commit to something, they tend to stay committed. And they’re very, very good at playing long games that require patience, especially when entering new markets. Look at Apple Pay. 10 years ago, it was widely panned as a flop after a slow first year. Now it’s everywhere.
Jill Goldsmith, Deadline:
Apple is losing more than $1 billion a year on streamer Apple TV+, according to a report in the Information that cited two people familiar with the matter. The tech giant has spent over $5 billion a year on content since launching Apple TV+ in 2019 but trimmed that by about $500 million last year, the report said.
The headline on Wayne Ma’s report at The Information set the framework: “Apple Streaming Losses Top $1 Billion a Year” — the story got picked up widely, and almost everyone who did framed it in terms of losing or a loss. But is it a loss when Apple expected the business to be unprofitable for a decade or more? From Scharon Harding’s paraphrasing at Ars Technica of Ma’s paywalled report:
Apple TV+ being Apple’s only service not turning a profit isn’t good, but it’s also expected. Like other streaming services, Apple TV+ wasn’t expected to be profitable until years after its launch. An Apple TV+ employee that The Information said reviewed the streaming service’s business plan said Apple TV+ is expected to lose $15 billion to $20 billion during its first 10 years.
For comparison, Disney’s direct-to-consumer streaming business had operating losses of $11.4 billion between the launch of Disney+ in fall 2020 and April 2024. Disney’s streaming business became profitable for the first time in its fiscal quarter ending on June 29, 2024.
The above two paragraphs of essential context are buried 13 paragraphs down. If Apple expected TV+ to operate in the red, to the tune of $15–20 billion over its first decade, and halfway through that decade (TV+ debuted in November 2019) it operated in the red to the tune of $1 billion for the year — doesn’t that mean costs are exactly in line with their expectations?
The insinuation here is that Apple’s pissing this money away and doesn’t know what they’re doing. Maybe they are! But if so it was exactly Eddy Cue and Tim Cook’s strategy to piss this money away. If Apple had expected TV+ to be profitable or break-even in 2024, then a $1 billion operating loss would be a story. But as it stands it’s just a cost. How much did Apple “lose” on electricity bills last year?
Juli Clover, writing for MacRumors last week:
With new iOS software updates, Apple has been automatically turning Apple Intelligence on again even for users who have disabled it, a decision that has become increasingly frustrating for those that don’t want to use Apple Intelligence.
After installing iOS 18.3.2, iPhone users have noticed that Apple Intelligence is automatically turned on, regardless of whether it was turned off prior to the update being installed. There is an Apple Intelligence splash screen that comes up after updating, and there is no option other than tapping “Continue,” which turns on Apple Intelligence.
If you’ve updated to iOS 18.3.2 and do not want Apple Intelligence enabled, you will need to go to the Settings app, tap on Apple Intelligence, and then toggle it off. When Apple Intelligence is enabled, it consumes up to 7GB of storage space for local AI models, which is an inconvenience when storage space is limited.
I’d been seeing complaints about this, including from some friends who are developers and/or had previously worked on iOS as engineers at Apple. A bunch of regular DF readers have written to complain about it too. I wouldn’t call it a deluge, but I’ve gotten an unusual number of complaints about this. (And at CNet, Jeff Carlson reports the same thing happening with MacOS 15.3.2.)
I hadn’t experienced it personally because I have Apple Intelligence enabled on my iPhone. But my year-old iPhone 15 Pro was still running iOS 18.2. So I disabled Apple Intelligence on that phone, then updated it to 18.3.2. When it finished, Apple Intelligence was re-enabled. I also tried this on my iPhone 16e review unit, which was still running iOS 18.3.1 (albeit a version of 18.3.1 with a unique build number for the 16e). I turned Apple Intelligence off, upgraded to 18.3.2, and on that iPhone, Apple Intelligence remained off after the software upgrade completed.
So I don’t know if this is a bug that only affects some iPhones, or a deliberate growth hacking decision from Apple to keep turning this back on for people who have explicitly turned it off. But it’s definitely happening.
And while the 7 GB of storage space required for the model is a legitimate technical reason to turn it off, I think (judging from my email from DF readers) the main reason people disable Apple Intelligence is that they don’t like it, don’t trust it, and to some degree object to it. It could take up no additional storage space at all and they’d still want it disabled on their devices, and they are fucking angry that Apple’s own software updates keep turning it back on. Put aside the quality or utility of Apple Intelligence as it stands today, and there are people who object to the whole thing on principle or, I don’t know, just vibes alone. Feelings are strong about this. Turning it back on automatically, after a user had turned it off manually, leads those users to correctly distrust Apple Intelligence specifically and Apple in general.
If it’s a bug, it’s a bug that makes Apple look like a bunch of gross shysters. If it’s not a bug, it means Apple is a bunch of gross shysters. I’d wager on bug — especially after seeing it not happen on my 16e review unit. I’m thinking it’s something where it’s supposed to be enabled by default, once, for people who’ve never explicitly turned Apple Intelligence on or off previously, but that for some devices where it has been turned off explicitly, somehow the software update is mistaking it for the setting never having been touched. Apple needs to get it together on this one.
Ina Fried, reporting for Axios:
The suit, filed Wednesday in U.S. District Court in San Jose, seeks class action status and unspecified financial damages on behalf of those who purchased Apple Intelligence-capable iPhones and other devices.
“Apple’s advertisements saturated the internet, television, and other airwaves to cultivate a clear and reasonable consumer expectation that these transformative features would be available upon the iPhone’s release,” the suit reads. “This drove unprecedented excitement in the market, even for Apple, as the company knew it would, and as part of Apple’s ongoing effort to convince consumers to upgrade at a premium price and to distinguish itself from competitors deemed to be winning the AI-arms race.”
Most of these class action lawsuits are bullshit, but it’s hard to argue with the basic premise of this one.
This is beautiful and crazy, and no, I’m not going to buy one, but damn I’m tempted and I’d sure like to try one. I’m glad it exists.
Mark Gurman, with a blockbuster scoop for Bloomberg:
Apple Inc. is undergoing a rare shake-up of its executive ranks, aiming to get its artificial intelligence efforts back on track after months of delays and stumbles, according to people familiar with the situation.
Chief Executive Officer Tim Cook has lost confidence in the ability of AI head John Giannandrea to execute on product development, so he’s moving over another top executive to help: Vision Pro creator Mike Rockwell. In a new role, Rockwell will be in charge of the Siri virtual assistant, according to the people, who asked not to be identified because the moves haven’t been announced.
Rockwell will report to software chief Craig Federighi, removing Siri completely from Giannandrea’s command. Apple is poised to announce the changes to employees this week. The iPhone maker’s senior leaders — a group known as the Top 100 — just met at a secretive, annual offsite gathering to discuss the future of the company. Its AI efforts were a key talking point at the summit, Bloomberg News has reported. [...]
My quick take on this is that it’s a turf battle that Craig Federighi just won. It’s not just putting a new executive in charge of Siri, it’s moving Siri under Federighi’s group.
How Gurman got this scoop before Apple had announced the changes — even internally — is rather unbelievable. It’s not “Bloomberg” that got this scoop. It’s Mark Gurman. And trust me, Apple PR did not leak this to him deliberately. I’m sure they’re now accelerating an announcement, at least internally, framing it on their own terms. I can only guess that Gurman hinted at his sourcing in the passage above: Tim Cook must have announced these changes at the Top 100 retreat this week, and at least two of those attendees leaked the news to Gurman. Unprecedented.
Also:
Rockwell is currently the vice president in charge of the Vision Products Group, or VPG, the division that developed Apple’s headset. As part of the changes, he’ll be leaving that team and handing the reins to Paul Meade, an executive who has run hardware engineering for the Vision Pro under Rockwell.
I don’t find it surprising at all that Rockwell was given this task.
Giannandrea will remain at the company, even with Rockwell taking over Siri. An abrupt departure would signal publicly that the AI efforts have been tumultuous — something Apple is reluctant to acknowledge. Giannandrea’s other responsibilities include oversight of research, testing and technologies related to AI. The company also has a team reporting to Giannandrea investigating robotics.
This I find a little surprising. But maybe I shouldn’t. I don’t buy Gurman’s argument that dismissing Giannandrea would “signal publicly that the AI efforts have been tumultuous”. Apple already signaled that publicly when they announced that all of the ambitious features for Siri and Apple Intelligence that were promised for this year’s OS cycle would be postponed until next year’s OS cycle. That’s public tumult. But I mean, you can see for yourself that Apple’s AI efforts have been “tumultuous” by asking Siri on your iPhone, right now, what month it is.
What Apple needs to signal is that they don’t expect to deliver a significantly better Siri without making significant changes to the team behind Siri.
But maybe the answer is as simple as that Giannandrea is good at leading and managing teams doing advanced research that is abstracted from product. So move the products out of his division and into Federighi’s, and put someone who knows how to ship directly in charge of Siri. Leave Giannandrea in charge of a division focused on research and technology. Attention has moved on from “machine learning” to LLMs, but Apple’s machine learning game has gotten very good.
Here’s an update I just appended to my post yesterday, after linking to Gus Mueller’s suggestion that Apple open up a semantic index to third-party AI apps:
HealthKit already works a lot like what Mueller is suggesting here (for, say, “SemanticKit”). With explicit user permission — that can be revoked at any time — third party apps can both read from and write to your Health data. Apple does a lot of that itself, both through Apple Watch and from the various activity-related things an iPhone can track, but third-party apps and devices are welcome participants, in a private, easily-understood way.
Nobody is suggesting Apple should give up on AI. Quite the opposite. They really need to go from being a joke to being good at it, fast. But there’s no reason at all they should build out a strategy that relies on Apple doing all of it themselves, and Apple users relying solely on Apple’s own AI. Do it like Health — a model that has proven to be private, easily understood, and welcoming to third-party apps.
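Here’s a hypothetical sketch of what a HealthKit-style permission model for a semantic index might look like. “SemanticKit” is Mueller’s placeholder name; nothing below is a real Apple API — it just models the Health-style properties that matter: explicit per-app grants, revocable at any time, with both read and write access:

```python
# Hypothetical sketch of a HealthKit-style permission model for a semantic
# index ("SemanticKit" is a placeholder name, not a real Apple API).
class SemanticIndex:
    def __init__(self):
        self._grants = {}    # app id -> set of permissions ("read", "write")
        self._entries = []   # the shared semantic index itself

    def request_authorization(self, app, permissions):
        # In the HealthKit model, the user approves each request per data
        # type; here we simply record the grant the user approved.
        self._grants[app] = set(permissions)

    def revoke(self, app):
        # The user can withdraw an app's access at any time.
        self._grants.pop(app, None)

    def write(self, app, entry):
        if "write" not in self._grants.get(app, set()):
            raise PermissionError(f"{app} may not write to the index")
        self._entries.append(entry)

    def read(self, app):
        if "read" not in self._grants.get(app, set()):
            raise PermissionError(f"{app} may not read the index")
        return list(self._entries)
```

A third-party AI app would request authorization, read from and contribute to the index while the grant stands, and lose all access the moment the user revokes it — exactly how Health data access behaves today.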
(Thanks to Bill Welense for the suggestion.)
Last March, when Apple introduced the then new M3 MacBook Airs, they moved the base model 13-inch M2 MacBook Air into the magic $999 spot in their own lineup, replacing the M1 MacBook Air. But in mid-March it was announced that Walmart would begin selling the M1 MacBook Air — in one tech-spec configuration (8 GB RAM, 256 GB SSD), but three colors (gold, silver, space gray) — for just $700.
This year Apple replaced the entire lineup of MacBook Airs that it sells itself with M4-based models, including the $999 starting-price model. Online, Walmart sells a handful of MacBook models now, at, per Walmart’s brand, slightly lower prices than Apple itself. But the one and only MacBook they seem to stock in their retail stores is the classic wedge-shaped M1 MacBook Air — now down to $650.
It’s over four years old now, and yes, 8 GB RAM and 256 GB of storage are meager, but it’s almost certainly the best new laptop you can buy for that price. Assuming Apple thinks this partnership is a success, eventually they’ll have to replace this with a more recent MacBook Air. But I suspect the main reason it’s still the M1 Air (and hasn’t been replaced by, say, the M2 Air) is not about the specs or performance, per se, but rather simply how it looks. It looks like an older MacBook. Walmart might not get an updated MacBook with a more-recent-than-M1 chip until Apple refreshes the industrial design on its current MacBook Airs.
Whole Reddit thread examining this simple question: “What month is it?” and Siri’s “I’m sorry, I don’t understand” response (which I just reproduced on my iPhone 16 Pro running iOS 18.4b4). One guy changed the question to “What month is it currently?” and got the answer “It is 2025.”
Update: Ask Siri (with Apple Intelligence™) “ChatGPT, what month is it?” and, though you’ll have to wait a few extra seconds, you’ll get the right answer each time. Perhaps the current month is “broad world knowledge” and Siri shouldn’t even attempt to answer such a complex question on its own?
News from Apple that I let slip by a few weeks ago, but that seems apt again today:
Apple Intelligence, the personal intelligence system that delivers helpful and relevant intelligence, will soon be available in more languages, including French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, and Chinese (simplified) — as well as localized English for Singapore and India.
These new languages will be accessible in nearly all regions around the world with the release of iOS 18.4, iPadOS 18.4, and macOS Sequoia 15.4 in April, and developers can start to test these releases today.
With the upcoming software updates, iPhone and iPad users in the EU will have access to Apple Intelligence features for the first time, and Apple Intelligence will expand to a new platform in U.S. English with Apple Vision Pro — helping users communicate, collaborate, and express themselves in entirely new ways.
Given that Apple Intelligence isn’t exactly setting the world on fire, I think in the grand scheme of things, it’ll wind up being filed away under “Oh yeah, remember that?” that the EU got it 4-5 months after it debuted. (Clean Up in Photos is often great, and I genuinely enjoy notification summaries and miss them now that they’re disabled for news apps; the rest I don’t use, and the most ambitious aspects of Apple Intelligence are (you may have heard) delayed for everyone, not just the EU.)
Apple was concerned that the EU’s hardline interpretation of the DMA was such that the European Commission considered it a violation of the DMA that Apple Intelligence wasn’t an interchangeable component. Like the way the EC forced Apple to open up iOS to alternative app marketplaces — there was uncertainty whether they’d demand the same for system-integrated AI. And if that’s what the EC had demanded, they simply wouldn’t have gotten system-integrated AI for years. But I’m not sure how to square up today’s decisions — requiring Apple to enable third-party alternatives to system-level features like AirPlay and AirDrop — with an interpretation that the EU will be fine with Apple Intelligence only offering Apple’s own AI (along with Apple’s approved partners, like OpenAI).
I think the regime change at the European Commission has changed things to some degree, but quietly. Former competition chief Margrethe Vestager was a firebrand. Back in June last year, after Apple had announced that Apple Intelligence would be delayed indefinitely in the EU for iOS, she made clear that she thought it was anti-competitive:
“I find that very interesting that they say we will now deploy AI where we’re not obliged to enable competition. I think that is that is the most sort of stunning open declaration that they know 100% that this is another way of disabling competition where they have a stronghold already.”
But Vestager is gone, and until today we hadn’t heard a whit about DMA compliance from her successor, Teresa Ribera. In September, when the proceedings that resulted in today’s decisions opened, I wrote:
Also worth noting: Margrethe Vestager is on her way out, about to be replaced by Spanish socialist Teresa Ribera, a career climate expert (which, possibly, might give her an affinity for Apple, far and away the most climate-friendly large tech company) with no experience in competition law. To me that makes Ribera an odd choice for the competition chief job, but apparently that makes sense in the EU. It remains unclear to me whether Ribera supports Vestager’s crusade against the DMA’s designated “gatekeepers”. If she doesn’t, is this all for naught?
Until today, that remained an open question. Now it appears the Commission’s crusading course is unchanged — it’s just no longer accompanied by inflammatory commentary from the commissioners in charge.
The European Commission, today:
Today, the European Commission adopted two decisions under the Digital Markets Act (DMA) specifying the measures that Apple has to take to comply with certain aspects of its interoperability obligation. [...]
The first set of measures concerns nine iOS connectivity features, predominantly used for connected devices such as smartwatches, headphones or TVs. The measures will grant device manufacturers and app developers improved access to iPhone features that interact with such devices (e.g. displaying notifications on smartwatches), faster data transfers (e.g. peer-to-peer Wi-Fi connections, and near-field communication) and easier device set-up (e.g. pairing).
Benjamin Mayo, reporting for 9to5Mac:
In a statement to 9to5Mac, Apple firmly rebuked the EU decision announced today about specific interoperability requirements the company must implement over the coming months.
Apple said “Today’s decisions wrap us in red tape, slowing down Apple’s ability to innovate for users in Europe and forcing us to give away our new features for free to companies who don’t have to play by the same rules. It’s bad for our products and for our European users. We will continue to work with the European Commission to help them understand our concerns on behalf of our users”.
In regards to customer privacy, Apple is especially concerned with the requirements surrounding opening up access to the iOS notification system. The company indicated these measures would allow companies to suck up all user notifications in an unencrypted form to their servers, sidestepping all privacy protections Apple typically enforces.
My interpretation of the adopted decision is that the EU is requiring Apple to treat iOS like a PC operating system, like MacOS or Windows, where users can install third-party software that runs, unfettered, in the background.
Apple’s statement makes clear their staunch opposition to these decisions. But at least at a superficial level, the European Commission’s tenor has changed. The quotes from the Commission executives (Teresa Ribera, who replaced firebrand Margrethe Vestager as competition chief, and Henna Virkkunen) are anodyne. Nothing of the vituperativeness of the quotes from Vestager and Thierry Breton in years past. But the decisions themselves make clear that the EU isn’t backing down from its general position of seeing itself as the rightful decision-maker for how iOS should function and be engineered, and that Apple’s core competitive asset — making devices that work better together than those from other companies — isn’t legal under the DMA.
Sebastiaan de With:
You can speculate what the ‘e’ in ‘16e’ stands for, but in my head it stands for ‘essential’. Some things that I consider particularly essential to the iPhone are all there: fantastic build quality, an OLED screen, iOS and all its apps, and Face ID. It even has satellite connectivity. Some other things I also consider essential are not here: MagSafe is very missed, for instance, but also multiple cameras. It would be reasonable to look at Apple’s Camera app, then, and see what comprises the ‘essential’ iPhone camera experience according to Apple.
Alex Cheema is the founder of EXO Labs, an AI company focused on “AI you can trust with your data” by making systems that run locally, on computers you own and control. Apple provided him with two M3 Ultra Mac Studios, each maxed out with 512 GB of unified memory. Within a day, he had them linked together by Thunderbolt 5 and had the full DeepSeek R1 model running on his desk.
Sure, that’s over $20,000 of computing hardware. But to my knowledge there is no other way in the world to run the full DeepSeek R1 model for even close to $20,000, let alone on your desk rather than in a data center. It’s an exclusive advantage, made possible by Apple Silicon’s general performance and the breakthrough of Apple’s unified memory architecture, which lets the GPU cores access the same RAM as the CPU cores.
Apple has tremendous technical advantages to offer in AI. But they’re marketing Genmojis of hot dogs carrying briefcases.
Gus Mueller:
A week or so ago I was grousing to some friends that Apple needs to open up things on the Mac so other LLMs can step in where Siri is failing. In theory we (developers) could do this today, but I would love to see a blessed system where Apple provided APIs to other LLM providers.
Are there security concerns? Yes, of course there are, there always will be. But I would like the choice.
The crux of the issue in my mind is this: Apple has a lot of good ideas, but they don’t have a monopoly on them. I would like some other folks to come in and try their ideas out. I would like things to advance at the pace of the industry, and not Apple’s. Maybe with a blessed system in place, Apple could watch and see how people use LLMs and other generative models (instead of giving us Genmoji that look like something Fisher-Price would make). And maybe open up the existing Apple-only models to developers. There are locally installed image processing models that I would love to take advantage of in my apps.
The analogy I used, talking with Jason Snell during my guest stint on Upgrade last week, was to the heyday of desktop publishing. The Mac was the platform for graphic design because it was the best platform for using design apps. Fonts worked better and looked better on the Mac. Printing worked better from Macs. Peripherals worked better. The apps themselves looked better on the Mac than they did on Windows. The Mac had taste and designers (hopefully) have taste. Graphic designers could understand how their machines worked, and maintain them themselves, in a way they couldn’t with PCs.
But Apple didn’t make any of the actual apps. Companies like Adobe and Macromedia and Aldus did. Independent small developers made niche extensions for use inside apps like Photoshop, FreeHand, and QuarkXPress. When a new app came along like InDesign — which quickly ate Quark’s lunch — the Mac remained the dominant platform to use.
Making a great platform where other developers can innovate is one of Apple’s core strengths. Apple got even better at it once Mac OS X hit its stride in the 2000s — the Cocoa APIs really did empower outside developers to make world-class apps providing experiences that couldn’t be matched on other platforms like Windows or Linux. Then it happened again, with a much bigger audience, with iOS. What desktop publishing was to the Mac in the 1990s, social media was to the iPhone in the 2010s. Apple didn’t make the apps — they made the best platform to use those apps.
Apple should be laser focused on doing this for AI now. Where I quibble with Mueller is that I don’t want Apple to get out of the way. I want Apple to pave the roads. Apple doesn’t have to make the cars (literally) — just pave the best roads. Make the Mac the best platform for outside developers to create innovative AI systems and experiences. Make iOS the best consumer device to use AI apps from any outside developer. Work on APIs and frameworks for the AI age. No company has ever been better than Apple at designing and delivering that sort of API. Lean into that. It’s as useful, relevant, and profitable an institutional strength (and set of values) today as ever.
In a follow-up post, Mueller shows he’s thinking like I’m thinking:
But off the top of my head, here’s one idea that I think could really help and reap benefits for both Apple and developers.
Build a semantic index (SI), and allow apps to access it via permissions given similar to what we do for Address Book or Photos.
Maybe even make the permissions to the SI a bit more fine-grained than you normally would for other personal databases. Historical GPS locations? Scraping contents of the screen over time? Indexed contents of document folder(s)? Make these options for what goes into the SI.
And of course, the same would be true for building the SI. As a user, I’d love to be able to say “sure, capture what’s on the screen and scrape the text out of that, but nope - you better not track where I’ve been over time”.
HealthKit already works a lot like what Mueller is suggesting here (for, say, “SemanticKit”). With explicit user permission — which can be revoked at any time — third-party apps can both read from and write to your Health data. Apple does a lot of that itself, both through Apple Watch and from the various activity-related things an iPhone can track, but third-party apps and devices are welcome participants, in a private, easily-understood way.
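Sketching Mueller’s idea in code makes the shape of it clearer. Here’s a minimal, purely hypothetical Python sketch (every name below is invented for illustration — there is no real “SemanticKit” API) of what per-category, user-revocable grants over a semantic index might look like:

```python
# Illustrative sketch of a permission-gated semantic index, loosely modeled
# on HealthKit-style per-category grants. All names here are hypothetical.
from dataclasses import dataclass, field
from enum import Enum, auto

class Category(Enum):
    SCREEN_TEXT = auto()   # text scraped from the screen over time
    GPS_HISTORY = auto()   # historical locations
    DOCUMENTS = auto()     # indexed contents of document folders

@dataclass
class SemanticIndex:
    _grants: dict = field(default_factory=dict)   # app_id -> set of Category
    _entries: dict = field(default_factory=dict)  # Category -> list of strings

    def grant(self, app_id: str, category: Category) -> None:
        # The user grants an app access to one category at a time.
        self._grants.setdefault(app_id, set()).add(category)

    def revoke(self, app_id: str, category: Category) -> None:
        # Grants are revocable at any time, per app, per category.
        self._grants.get(app_id, set()).discard(category)

    def ingest(self, category: Category, text: str) -> None:
        self._entries.setdefault(category, []).append(text)

    def query(self, app_id: str, term: str) -> list:
        # An app only ever sees results from categories the user granted it.
        allowed = self._grants.get(app_id, set())
        return [
            text
            for category in allowed
            for text in self._entries.get(category, [])
            if term.lower() in text.lower()
        ]

index = SemanticIndex()
index.ingest(Category.SCREEN_TEXT, "Lunch with Dave at 12:30")
index.ingest(Category.GPS_HISTORY, "2025-03-01: Philadelphia")
index.grant("com.example.assistant", Category.SCREEN_TEXT)
# The app can search screen text, but ungranted GPS history stays invisible:
print(index.query("com.example.assistant", "dave"))   # ['Lunch with Dave at 12:30']
print(index.query("com.example.assistant", "phila"))  # []
```

The point of the sketch is Mueller’s fine-grained consent model: what goes *into* the index, and what each app can read *out* of it, are separate, per-category switches in the user’s hands.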
Scharon Harding, writing for Ars Technica:
Reports of Roku customers seeing video ads automatically play before they could view the OS’ home screen started appearing online this week. A Reddit user, for example, posted yesterday: “I just turned on my Roku and got an ... ad for a movie, before I got to the regular Roku home screen.” Multiple apparent users reported seeing an ad for the movie Moana 2. The ads have a close option, but some users appear to have not seen it.
When reached for comment, a Roku spokesperson shared a company statement that confirms that the autoplaying ads are expected behavior but not a permanent part of Roku OS currently. Instead, Roku claimed, it was just trying the ad capability out. [...]
“Our recent test is just the latest example, as we explore new ways to showcase brands and programming while still providing a delightful and simple user experience.”
What I’d find delightful and simple is disconnecting my Roku box and throwing it out the window.
Eric Migicovsky:
We’re excited to announce two new smartwatches that run open source PebbleOS and are compatible with thousands of your beloved Pebble apps.
- Core 2 Duo has an ultra crisp black and white display, polycarbonate frame, costs $149 and starts shipping in July.
- Core Time 2 has a larger 64-colour display, metal frame, costs $225 and starts shipping in December.
My advice would have been to return with just one watch. Make a decision: color or monochrome. I’d sort of lean toward black-and-white, to differentiate it from Apple Watch and other high-end smartwatches. They’re never going to out-color Apple on display quality, so why not go the other way and lean in on black-and-white utility and contrast?
I would also suggest that whining about the fact that iOS doesn’t allow third-party devices the sort of integration that Apple Watch offers isn’t the path forward. Instead of arguing that “Apple restricts Pebble from being awesome with iPhones”, lean into the ways that Pebble can be awesome because it isn’t an Apple Watch. 30-day battery life is awesome. I don’t think Apple Watch will ever offer that. Being able to run whatever apps — including watch faces — that you want on your own Pebble watch is awesome, and I know Apple Watch will never offer that. Lean into what Pebble watches can do that Apple Watches can’t. If the experience as a Pebble owner can be a lot better paired with an Android phone than an iPhone, lean into that. Show how much better it is on Android than iOS. Compete.
If you can’t show how much better Pebble is when paired to an Android device (which they couldn’t do 10 years ago), then what’s the point?
Taegan Goddard, writing at Political Wire regarding pollster David Shor’s appearance on Ezra Klein’s podcast:
His surveys indicate a clear causal relationship: People who relied on TikTok for news were much more likely to swing toward Trump than those who got their information from TV. His most striking data point:
When you zoom in on people who get their news from TikTok but don’t care very much about politics, this group is eight percentage points more Republican than they were four years ago — which is a lot.
What remains unclear is why this shift happened. Was TikTok’s parent company, ByteDance, subtly adjusting its algorithm to undermine Democrats? Or was the platform simply reflecting broader anti-incumbent sentiment? Shor concedes:
You could tell a story that maybe just anti-incumbent stuff is going to do really well on TikTok, and Democrats are going to do great now. I don’t really know. But I think that, for whatever reason, this major shift really helped Republicans.
It used to be that getting your message out required persuading reporters, editors, and gatekeepers — people trained to vet and verify information.
Now anyone can make a short video, and if it’s compelling enough, it spreads like wildfire — except that it may be following a path predetermined by TikTok’s algorithms.
I worry that the liberal/left response to this will be to declare, with exasperation, that people shouldn’t be getting their news or forming their political opinions from what they see on TikTok. Instead, you need to meet people where they are, and craft messages for the media they consume.
Random Augustine has written a splendidly nerdy but very approachable overview of the evolution of Apple’s XNU kernel over the last decade:
2017 — Page Protection Layer
With the release of the iPhone 8 and iPhone X containing the A11 processor, Apple introduced a security feature known as the Page Protection Layer (PPL). This hardware+software feature isolated a small part of the kernel and gave it privileges to modify memory page tables — critical structures that manage memory access. The rest of the kernel lost the ability to directly modify these page tables. The PPL’s limited attack surface ensured that bypasses were infamously rare. While PPL added a layer of protection, it was only partly effective as the rest of the kernel still held most privileges required to compromise data without modifying page tables.
2021–2023 — Secure Page Table Monitor
Following PPL, the release of the iPhone 13 containing the A15 processor introduced new functionality utilised in iOS 17: the Secure Page Table Monitor (SPTM). This replaced and improved upon the PPL by securing additional memory functions and dividing them into subsystems, further isolating small kernel components. Validation of code signatures, confirming that all code had been signed by Apple, was also isolated.
Around this time, oblique references to exclaves began to surface in XNU source code. These exclaves were speculated to be the subsystems managed by SPTM. Then 2024 happened…
2024 — Exclaves: A major addition to XNU
With the release of XNU source code supporting M4 and A18 based systems (such as the iPhone 16), the curtain was partly pulled back on exclaves. (Exclaves are not active on prior processors).
It is now clear that exclaves are part of a much larger redesign of XNU’s security model.
I am reminded of Gall’s Law:
A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.
(I also suspect that Siri — today’s Siri at least — might be a canonical example of “a complex system designed from scratch”. But that’s a different topic.)
Nick Heer:
They are impressive, but my interpretation of statistics like these is that one often finds percentages used like this when neither actual number is very large. Nevertheless, another indication that browser choice screens can have a positive effect for smaller browsers and, conversely, also a reminder of the power of defaults.
Saying the daily users have doubled isn’t very meaningful when they don’t state the baseline. It’s a bit of a Bezos chart. And what’s the proof that this growth is from happy users — users who, upon seeing the DMA browser choice screen on their iPhones, realized only then that they wanted to switch to Firefox? Surely some number of users who switched to Firefox via the choice screen did so by mistake, because they were confused.
The best case scenario is that this growth for Firefox (and presumably for other alternate browsers that qualified for the EU choice screens) means that alternative browsers have gone from a tiny usage share to a twice-as-large-but-still-tiny share, and that most of the growth comes from happy users. I see no proof, though, that the growth hasn’t at least significantly come from confused users who now wonder what happened to Safari. And either way, the DMA’s mandatory choice screen has, thus far, been relatively ineffective overall.
Ten years ago I bought a pair of Beyerdynamic DT 770 Pro headphones for use while podcasting. My product research was rigorous and exhaustive: I asked Marco Arment which headphones I should buy, he said these, so I bought them. They’re offered in three impedance variants: 32, 80, and 250 ohms. Beyerdynamic describes 80 ohms as the best “allrounder” choice, and that’s what Marco told me to get.
I’ve since worn them to record at least 275 episodes of The Talk Show (I think this episode was the first) and nearly all of the five-years-and-counting run of Dithering. They sound great, but more importantly, they’re super comfortable. I can wear them for 3+ hours and my ears don’t feel too bad at all. They’re also built to last. Just about everything on mine still looks fairly new, despite my having worn them for something approaching 1,000 hours. No cracking on the cable, and the padding on the headband looks new. The one part that didn’t look new was the velour ear pads. Last week I ordered replacements from Beyerdynamic for $40; they arrived earlier this week and I swapped the old pads for the new ones today.
When I bought my headphones in 2015, they cost $250. Today the price is down to just $170, either direct from Beyerdynamic or from Amazon (that’s a make-me-rich affiliate link). I am not an audiophile, and I literally only use mine for podcasting. But I’ve spent quite a lot of time podcasting with them over the last decade. I’ll bet I’m still using the same pair (with another set of fresh ear pads) 10 years from now.
New (well, newish) Mac app from John Siracusa:
Hyperspace searches for files with identical contents within one or more folders. If it finds any, it can then reclaim the disk space taken by all but one of the identical files — without removing any of the files!
You can learn more about how this is done, if you’re interested, but the short version is that Hyperspace uses a standard feature of the macOS file system: space-saving clones. The Finder does the same thing when you duplicate a file.
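The duplicate-finding half of what Siracusa describes is easy to sketch. This isn’t Hyperspace’s actual implementation — just an illustrative Python sketch of grouping files by identical contents; the space-reclaiming half depends on APFS’s clone support, which only the closing comment gestures at:

```python
# Toy sketch: find files with byte-for-byte identical contents under a folder.
# Group by size first (cheap), then hash only the files that share a size.
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: Path) -> list:
    """Return groups of paths whose file contents are identical."""
    by_size = defaultdict(list)
    for path in root.rglob("*"):
        if path.is_file():
            by_size[path.stat().st_size].append(path)
    groups = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # a unique size can't have a duplicate
        by_hash = defaultdict(list)
        for path in paths:
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
        groups.extend(g for g in by_hash.values() if len(g) > 1)
    return groups

# On macOS, each duplicate after the first could then be replaced with a
# space-saving APFS clone of the original -- the same mechanism the Finder
# uses when you duplicate a file -- reclaiming the disk space.
```

(For large files, a real tool would hash in chunks rather than reading each file whole, but the structure is the same.)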
I love everything about this app. I love the name — it “works” in like at least three ways. I love that it’s right up Siracusa’s alley. I love that Siracusa has talked about it, at wonderful length, on ATP and expounded upon it on his blog. I love that the premise sounds a little crazy but the explanation makes all the sense in the world. I love that this small, laser-focused utility is fully and splendidly documented. I love the way it looks. It’s got a great icon. I mean of course it would, but still, let’s celebrate how fun this is.
Not new, but new to me, is this delightful 7-minute short with a behind-the-scenes look at SNL’s cue card team, led by longtime main cue card guy Wally Feresten. Sometimes you just can’t beat analog.
As a kid I loved Richard Scarry’s books. As an adult I loved (and love) Chris Ware’s graphic novels. As a parent I loved reading Scarry’s books, again, with my son. So of course this essay from Ware, commemorating the 50th anniversary edition of Scarry’s Cars and Trucks and Things That Go, hit hard for me. Bet it will for you too.
Om Malik:
I have my own explanation, something my readers are familiar with, and it is the most obvious one. Just as Google is trapped in the 10-blue-link prison, which prevents it from doing something radical, Apple has its own golden handcuffs. It’s a company weighed down by its market capitalization and what the stock market expects from it.
They lack the moral authority of Steve Jobs to defy the markets, streamline their product lineup, and focus the company. Instead, they do what a complex business often does: they do more. Could they have done a better job with iPadOS? Should Vision Pro receive more attention?
The answer to all those is yes. Apple has become a complex entity that can’t seem to ever have enough resources to provide the real Apple experience. What you get is “good enough.” And most of the time, I think it is enough — because what others have on the market is worse. They know how to build great hardware; it’s the software where they falter. In the case of Apple Intelligence, they have been caught short because others’ AI products, even when flawed, are significantly better than Apple’s own offerings.
Hardware inherently keeps a company honest in a way that software doesn’t. Hardware either works or it doesn’t. The only way to “upgrade” hardware is by installing newer software, or by taking the hardware apart and replacing physical components. It’s hard to think of a company, in any field, whose software is “better” than its hardware. Maybe Nintendo? But even with Nintendo, I’d say it’s more like their software is as good as their hardware. Also, an interesting thought that popped into my head reading Malik’s post just now: part of what makes Vision Pro so fascinating is that the software is better than the hardware. The hardware for immersive VR is so early-days that even the industry state-of-the-art — which is Vision Pro — stinks compared to where it’s going to be in even just five years. The 1984 Macintosh was a shitty computer with a 9-inch one-bit display, no hard drive, and an absurdly meager 128 kilobytes of RAM. But the software was amazing!
But the bigger, better point Malik makes is that “good enough” is enough to make Apple’s software seem ahead of its competition. I tried to make this point all the way back in 2007 with “Apple Needs a Nikon”, and I think the problem is worse now than it was then. No other company is even vaguely in Apple’s league. But Apple is sliding toward mediocrity on the software side. It’s very open for debate how far they’ve slipped. I, for one, would argue that they haven’t slipped far, and with an honest reckoning — especially with regard to everything related to Siri and AI — they can nip this in the bud. You might argue that they’ve slipped tremendously across the board. But what I don’t think is arguable is that their competition remains below Apple’s league. That’s what gives credence to the voices in Cupertino who are arguing that everything’s fine. Apple’s the only team in the top tier for UI design.
The best thing that could happen to Apple would be for Google to ship an Android Pixel experience that actually makes iPhone owners insanely jealous. Google is incapable of doing that through UI design. They’re incapable of catching up to Apple on hardware. But maybe on the AI front they can do it. Apple needs a rival.
Tesla’s share price has been having a hard time of it lately. The stock has lost about half its value since its all-time high back in December, and, since Musk took office alongside Donald Trump in January, dropped for 7 consecutive weeks, rebounding only ever-so-slightly last week, after Musk got the president of the United States to turn the White House lawn into a cheesy Tesla (sorry, Tesler) dealership. Tesla stock dropped another 5 percent today, on a day when the overall market was slightly up.
I bookmarked this Bryce Elder column at the Financial Times back on January 31, and now seems like a good time to link to it:
The usual explanation for when Tesla trading resembles a Pump.fun shitcoin is: “because Elon talks a lot”. Here’s JPMorgan analyst Ryan Brinkman to expand on the theme:
It’s not clear to us why Tesla shares traded as much as +5% higher in the aftermarket Wednesday, although we have some leading theories. Perhaps it was management’s statement that it had identified an achievable path to becoming worth more than the world’s five most valuable companies taken together (i.e., more than the $14.8 trillion combined market capitalizations of Apple, Microsoft, NVIDIA, Amazon, & Alphabet). Or maybe it was management’s belief that just one of its products has by itself the potential to generate “north of $10 trillion in revenue”. It may have even related to management guidance for 2026 (no financial targets were provided, but it was said to be “epic”) and for 2027 and 2028 (“ridiculously good”).
Brinkman, who has a long-standing “underweight” rating on Tesla, is beginning to sound a bit exasperated:
[T]he company’s financial performance and Bloomberg consensus for revenue, margin, earnings, and cash flow all keep coming down, but analyst price targets and the company’s share price keep going up. For instance, Tesla has missed Bloomberg consensus EBIT in 9 of the past 10 quarters by an average of -16.3%.
Consistently missing estimates is one thing. What Tesla has been doing is consistently missing lowered estimates. [...]
Tesla’s biggest asset is hyperbole. The more extreme the hyperbole, the more valuable it gets. Maybe after-hours market participants understand the dynamics better than Tesla bears, so are primed to park fundamentals and trade on vibes. Or maybe something else entirely is going on.
Sounds a lot like the other guy at the White House Auto Mall.
I’ve been commenting and expanding upon some of the commentary my piece prompted, and I have a few more coming, but it’s good to have Tsai collect a comprehensive overview.
Ray Maker, writing at DC Rainmaker:
This would not only be the first time Apple has created a non-watch heart rate sensor, but even more notably, the first time the company has enabled heart rate broadcasting over existing Bluetooth heart rate standards.
The question then becomes: Is it accurate?
Unfortunately, it turns out, that was not the question I should have started with. The real question to start with is: Is the heart rate function (accuracy aside), even usable? A lot of hours later, I have answers to both of those questions. And trust me, it’s a very mixed bag.
The answer:
It’s clear that any movement (even on a stationary bike) quickly leads to either dropouts or inaccurate heart rate. And outdoors running, it’s even worse. Ultimately, I don’t see any value in the heart rate sensor in this product, because it’s simply not good enough to be useful, even for casual use.
So maybe this feature isn’t coming to AirPods soon after all? I think there’s a good argument to be made that these are better than no heart rate monitor at all, but also not nearly as good as an Apple Watch or a dedicated device.
I’m a month late linking to it, but Chance Miller wrote a terrific review for 9to5Mac:
The last several releases from Beats, such as the Studio Buds Plus and Solo 4 headphones, have been powered by a custom Beats chip rather than an Apple-designed chip like what’s used in AirPods. For Beats, this has enabled better cross-platform support for Android users, but it’s also come at the cost of several popular features for Apple fans. For example, the Studio Buds Plus lack support for automatic in-ear detection, iCloud pairing, automatic device switching, personalized spatial audio, and more.
With the Powerbeats Pro 2, Beats has gone back to its roots and opted for an Apple-designed chip. The Powerbeats Pro 2 are powered by Apple’s H2 chip, the same chip used by the latest-generation AirPods Pro 2 and AirPods 4. This means you get the full suite of Apple-focused audio features.
The degree of shared engineering between Apple’s teams and Beats’s has always seemed odd to me. Sometimes it seems like Beats really is an independent subsidiary, focused on cross-platform headphones, and other times it feels like they’re making Apple products under a different brand label. The sweet spot seems to be about where they landed with these Powerbeats Pro 2.
All of the aforementioned features and improvements make Powerbeats Pro 2 an incredibly compelling product, but Beats has one more thing: Powerbeats Pro 2 feature built-in heart rate monitoring.
Each Powerbeats Pro 2 earbud has a built-in heart rate monitor comprised of four components. First, there’s an LED sensor that emits green LED light at a rate of over 100 pulses per second. This light is emitted through the skin and hits your red blood cells. The photodiode then receives the reflected light from the red blood cells that is modulated by the red blood flow. There’s an optical lens that helps direct and separate the transmitted and received light, along with an accelerometer to ensure accuracy and consistency in data collection.
Beats adds that the Powerbeats Pro 2’s heart rate sensor technology is derived from Apple’s work on the Apple Watch.
It’s weird, but cool, that Beats has delivered in-ear heartbeat monitoring before Apple’s own AirPods have. But now it seems like a lock that this will be a feature in AirPods Pro 3, right?
What I always want in a review I read — and what I try to provide to readers through my own reviews — is a sense of whether a product is for me. Powerbeats Pro 2 aren’t for me — and I know it, because Miller’s review describes them so well. But they seem like a terrific product that a lot of people would prefer to AirPods Pro.
Sebastiaan de With, on X, linking to my “Something Is Rotten” piece last week:
Ex-MobileMe team here. This was a brutal time.
It was so bad that when he presented iCloud onstage, Steve said “I know what you’re thinking: why should I trust them? They’re the ones who gave us MobileMe!”
Michael Gartenberg (who worked at Apple in product marketing for a few years at the tail end of the Jobs era), responded (across two tweets):
When I was at Apple and Apple University was still around there was a whole course on MobileMe and how it was possible that things ended up the way they did. Fascinating to hear all the backstory.
One of the lessons of the Apple University course was much of the MobileMe debacle was directly because Jobs didn’t care about it. He was too preoccupied with the newest iPhone at the time. He didn’t even introduce the product, a lot of the stuff crossed his desk that he ignored.
Twitter-like social posts enforce brevity, but I suspect Gartenberg would agree that it wasn’t that Jobs didn’t care about MobileMe at all. It was that he didn’t think he had to care enough to devote his personal attention to it. Yes, Apple should offer web-based functionality for some online fundamentals (email, calendar, contacts...) and, more importantly, Apple should provide over-the-air Internet sync for that data between customers’ devices. And it should just work, in the way that a hard drive “just works” without Steve Jobs paying close attention to the current state of Apple’s file system team. But then it turned out MobileMe didn’t “just work”, and Jobs decided that he needed to pay laser-focused attention to starting over and building what we now know as iCloud (which is really quite good, very reliable, and I’d say long ago surpassed the “it just works” threshold). Steve Jobs’s final keynote — at WWDC 2011 — was largely focused on the announcement of iCloud.
Who’s got that role inside Apple today — someone with high standards, good taste, and clout within the company — for Siri and Apple Intelligence? Someone who is going to say We didn’t care enough about this, but now we need to, and will.
From Drexel’s YouTube channel:
But far less recognized is that Drexel made the very bold decision of committing all students to purchase a previously unreleased and untested computer from Apple. This was, of course, the Macintosh (introduced in January 1984), which was unlike any previous computer. Drexel’s commitment to the Mac was also of great benefit to Apple, helping to legitimize this brand-new platform, which helped make the Mac a successful product that continues to thrive in education 40 years later.
This entire initiative, called the Drexel Microcomputer Project, was captured in a 1-hour documentary filmed by David Jones, Dean of the Pennoni Honors College from 2008 to 2014. The film premiered at Drexel in 1985.
I was very fortunate not only to know Dave Jones (who died in 2018) but to have him as a professor for several film criticism courses (one on westerns, and another on the works of Alfred Hitchcock). I was a computer science major, not a film major, but Jones didn’t care. He was also familiar with — dare I say, a fan of — my column in The Triangle, Drexel’s student newspaper. He took me to lunch my senior year and encouraged me to pursue writing as a career. He was a great teacher: thoughtful, kind, insightful, open-minded, and deeply knowledgeable.
I saw a screening of Going National back in 2011, and sat on a panel discussion with Jones to talk about it. It’s a good documentary, and he really captured the feel of Drexel’s campus at the time. It is a very ’80s movie. It was gratifying that I got to tell him, then, that his advice to me back in 1996 had worked out pretty well.
Alissa Falcone, in a good piece looking back at (my alma mater) Drexel University’s groundbreaking deal with Apple 40 years ago to provide deeply discounted Macintoshes to all students, and integrate them throughout the campus and curriculums:
Drexel was prepared to buy IBM computers — and had equipped its computer centers with IBMs for decades — but the cost came to more than $1,000 per unit. IBM’s young competitor Apple, on the other hand, was willing to give discounts, provided the University agreed to secret negotiations and discreet showings of its newest, unreleased personal computer.
Bruce Eisenstein, PhD, Arthur J. Rowland Professor of Electrical and Computer Engineering in the College of Engineering, was the head of the Department of Electrical and Computer Engineering at the time, and had been the founding faculty adviser for the Drexel Computer Society started in 1972. He was Drexel’s choice to meet with an Apple representative to see the future Macintosh, which had never-before-seen properties like a mouse, icons on a screen and different fonts. This new Apple product was more powerful and easier to use than earlier personal computers; novices could supposedly master it in 30 minutes (without the need to memorize and type coded commands). And Apple agreed on the $1,000 price tag for a model that sold to the public for $2,495.
“I went back to the selection committee and I said, ‘Listen, you have to forget the IBM. This new computer from Apple is the one you have to get. They are going to make it available to us for a thousand dollars — that’s all inclusive.’ And the first question was ‘Is it compatible with the IBM computer?’ Well, no. Was there software for it? No. Were there any programs for it, like a word processor? Not yet. So the committee justifiably kept saying, well, what’s the name of this? What’s it like? I couldn’t tell them. I had to say you just gotta trust me on this. So they took a vote and unanimously voted to adopt the unknown computer that turned out to be the Macintosh,” Eisenstein recalled in Building Drexel: The University and Its City, 1891-2016.
Drexel chose the untested Macintosh even knowing that Apple wouldn’t announce it to the public until January 1984 and that the computers wouldn’t be ready until March, almost halfway through that momentous academic year.
I never had the pleasure of meeting Eisenstein, but I’d sure like to thank him for his prescience. By the time I got to Drexel in 1991, the Mac was infused throughout campus.
Ten years ago I played two small roles in the release of the aforelinked Becoming Steve Jobs. First, I got to announce the book here at Daring Fireball, after having been sent an advance copy a few weeks earlier. My praise for the book then was glowing, but in hindsight, I think I undersold just how good — and how essential — it is. At the time of its launch, the book remained in the shadow of Walter Isaacson’s Jobs biography. Ten years later, I refer back to Becoming Steve Jobs regularly; Isaacson’s book almost never.
Second, the SoHo Apple Store in New York hosted a “Meet the Authors” event, and I had the pleasure of playing host for the interview. It’s still available, both as video and audio. And as I wrote at the time, “I’m kind of proud that it got flagged as ‘explicit’ — but that was Bill Gates’s fault.”
During Friday’s episode of Dithering — a free listen — Ben Thompson reminded me that my headline reference last week, alluding to the well-known line from Hamlet, had been used, to great effect, once before. Brent Schlender wrote a crackerjack piece for Fortune in March 1997 under the slightly-different-than-mine headline “Something’s Rotten in Cupertino”.
I should make very clear that I didn’t mean to allude to Schlender’s piece with mine. We both just riffed on the same idiom from Shakespeare. In my case, I’m writing about one initiative that’s gone awry inside a very successful, well-functioning Apple. The timeframe for Schlender’s piece, on the other hand, was the most precarious and dysfunctional period in the history of Apple. Then-CEO Gil Amelio had just announced the acquisition of NeXT in December 1996. By June, most of the board would be replaced, Amelio fired, and Steve Jobs would return as an “advisor”, and, by the end of 1997, as “interim CEO”. Schlender was all over what was really going on:
At Apple’s headquarters in Cupertino, California, a power play is in progress that calls into question who’s really running the company and that may very well put Apple in play once again. So thick is this plot that it reaches into the homes of some of the most powerful CEOs in Silicon Valley. The delicious irony is that what triggered the soap opera is a move Amelio hopes is his masterstroke: Apple’s $400-million acquisition of Next, and the advisory services of Steve Jobs that come bundled with it.
Amelio’s big deal is beginning to look more like a Next takeover of Apple. Never mind that Next Software was a boutique with revenues that would amount to less than a rounding error to Apple. Jobs, the Svengali of Silicon Valley, may have outdone himself this time: Not only did he collect $100 million and 1.5 million shares of Apple stock for his stake in Next, but his fingerprints are all over Amelio’s latest reorganization plan and product strategy — even though Jobs doesn’t have an operational role or even a board seat.
To the Machiavellian eye, it looks as if Jobs, despite the lure of Hollywood — lately he has been overseeing Pixar, maker of Toy Story and other computer-animated films — might be scheming to take over Apple for himself. If anyone doubts he could do it, all you have to do is ask his best friend, Oracle CEO Larry Ellison, the richest man in Silicon Valley. Says he: “Steve’s the only one who can save Apple.”
We all know today this is exactly how it played out. But in March 1997 none of it was obvious at all. It really wasn’t even clear whether Apple would still exist as an independent company by the end of the year. Just extraordinary reporting and storytelling by Schlender.
Go ahead and re-read Schlender’s 1997 Fortune piece, which we should be thankful remains online at all, but which has been mangled, formatting-wise, by whatever series of CMS transitions have kept it online for 28 years. But if you want the best version of this saga, get yourself a copy of Becoming Steve Jobs, the 2015 biography Schlender co-authored with Rick Tetzeli. I just re-read chapter 8 over the weekend — the chapter covering Jobs’s momentous 1997 (in addition to the Apple-NeXT reunification, that was also the year he hammered out an aggressive new deal with Disney for five post-Toy Story feature films) — and it’s just so good. If you don’t already have a copy of Becoming Steve Jobs, get it at Amazon or from Bookshop.org or Apple Books.
A new feature in our membership CMS (Passport — check it out) lets us make individual episodes of Dithering free for everyone to listen to (on the web). I can’t think of a better way to first use this new capability than to open up Friday’s episode, recapping my “Something Is Rotten in the State of Cupertino” article, and the resonance with which it hit. Even the cover art — selected weeks ago — captures how I’ve felt this week.
Give it a listen. Subscribe if you enjoy it.
My post Friday commenting (read: wise-cracking) on Mark Gurman’s explosive report on an all-hands Siri team meeting at Apple was begging for a bit of meta commentary on the reporting itself. But I’ve been doing so much of that regarding Gurman lately that I thought it best to hold it for a postscript. Here’s that postscript.
Both of these things are true:
In short, I do actually suspect — but can claim zero sources familiar with the matter to confirm — that Gurman hangs his toilet paper in an improper underhand fashion.
So let’s just examine how extraordinary and singular Gurman’s Friday report was. Nobody else reported on this meeting. Every other article about it — including mine — was commenting on Gurman’s exclusive report about the meeting. I’ve not seen one other report even confirming the meeting took place, let alone describing it in detail, replete with copious quotes from Siri senior director Robby Walker, who, according to Gurman, led the meeting. Not one. I’m not pointing that out to cast suspicion that the meeting did not take place, or that Gurman’s report portrayed it inaccurately, or that his direct quotations were not, in fact, direct quotations. I’m pointing out just how singular and extraordinary Mark Gurman is in this sphere. If it weren’t for Gurman’s report, we outside Apple (and probably most people outside the Siri team inside Apple) wouldn’t even know the meeting occurred.
How did Gurman not only get the scoop on this meeting, but copious direct quotes from Walker’s remarks to the team? Well, it was “according to people with knowledge of the matter, who asked not to be identified because the gathering was private”. In other words, more than one member of the Siri team, at least one of whom either recorded the meeting surreptitiously and slipped the recording to Gurman, or takes notes at the pace and accuracy of a court stenographer. Either way, these sources — plural — surely knew how the meeting would make Apple look if it were to leak.
I’ve long made my opinions about Bloomberg’s institutional journalistic credibility well known. But I don’t think they’re bereft of credibility — it’s the fact that they are deservedly well-regarded that makes their refusal to ever admit their own glaring mistakes so notable. When a Gurman report says “people”, that means “more than one”, and, I believe, he must be able to confirm to his editors that he got this information from more than one source. If he’s reporting direct quotes, I think that means he’s heard a recording. That’s extraordinary.
But I’d feel a lot better about our collective conventional wisdom regarding the nature of this particular all-hands Siri meeting if it had leaked to, and been reported on by, more than one reporter at more than one publication. ★
My thanks to WorkOS for sponsoring this week at DF. Modern authentication should be seamless and secure. WorkOS makes it easy to integrate features like MFA, SSO, and RBAC.
Whether you’re replacing passwords, stopping fraud, or adding enterprise auth, WorkOS can help you build frictionless auth that scales. Future-proof your authentication stack with the identity layer trusted by OpenAI, Cursor, Perplexity, and Vercel. Upgrade your auth today.
From Apple’s support documentation:
You can generate a report of requests your iPhone has sent to Private Cloud Compute.
1. Go to Settings, then tap Privacy & Security.
2. Tap Apple Intelligence Report, then choose a report duration for the last 15 minutes (default) or last 7 days. Choose Off to disable the report.
Note: The report may be empty if there haven’t been any Private Cloud Compute requests since you changed the duration.
3. Tap Export Activity, choose a place to store the file, then tap Export.
The report is saved as a file named Apple_Intelligence_Report.json. Open the file with a text reader.
These are the iOS instructions, but they’re exactly the same on MacOS 15 Sequoia. My first generated report was empty for the last 7 days, and it was empty again even after running the Writing Tools Proofread function on the text of my 4,000-word “Something Is Rotten in the State of Cupertino” article from this week. But when I ran the Writing Tools Summarize feature on the same text, I wound up with a long entry that was sent to Private Cloud Compute. So, at the moment, Summarize seems like a good way to invoke Private Cloud Compute, even from a relatively powerful Mac.
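Since the exported report is plain JSON, a few lines of Python can give you a quick look at what a report actually contains before you open it in a text reader. The report’s internal schema isn’t formally documented, so this sketch deliberately assumes nothing about its field names and only summarizes the file’s top-level shape:

```python
import json

def summarize_report(path):
    """Summarize an exported Apple_Intelligence_Report.json file.

    The report's schema is undocumented, so no field names are
    assumed here; this only reports the top-level shape of the JSON.
    """
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    if isinstance(data, dict):
        # A JSON object: list its keys so you know what to dig into.
        return {"type": "object", "keys": sorted(data)}
    if isinstance(data, list):
        # A JSON array: likely one entry per request.
        return {"type": "array", "entries": len(data)}
    return {"type": type(data).__name__}
```

Run it on the exported file — e.g. `summarize_report("Apple_Intelligence_Report.json")` — to see whether the report is a single object or an array of request entries, then drill into the interesting keys with a text editor or a tool like jq.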
Here’s the summary Apple Intelligence generated. I have to say: it’s pretty good. It’s completely petty but also completely me to notice and object to the way it uses two spaces after periods — and worse, only some of the time. Also, the sentence “This raises concerns about the company’s ability to maintain its position as a leader in AI innovation” is, let’s say, off the mark.
Update: Howard Oakley wrote a post with a brief overview of the structure and contents of these reports back on October 29.
In the two decades I’ve been in this racket, I’ve never been angrier at myself for missing a story than I am about Apple’s announcement on Friday that the “more personalized Siri” features of Apple Intelligence, scheduled to appear between now and WWDC, would be delayed until “the coming year”.
I should have my head examined.
This announcement dropped as a surprise, and certainly took me by surprise to some extent, but the signs were all there from the start. I should have been pointing out red flags starting back at WWDC last year, and I am embarrassed and sorry that I didn’t see what should have been clear to me all along.
The reason I missed this is twofold. First, I’d been lulled into complacency by Apple’s track record of consistently shipping pre-announced products and features. Their record in that regard wasn’t perfect, but the exceptions tended to be around the edges. (Nobody was particularly clamoring for Apple to make a multi-device inductive charging mat, so it never generated too much controversy when AirPower turned out to be a complete bust.) Second, I was foolishly distracted by the “Apple Intelligence” brand umbrella. It’s a fine idea for Apple to brand its AI features under an umbrella term like that, similar to how a bunch of disparate features that allow different Apple devices to interoperate are under the “Continuity” umbrella. But there’s no such thing, technically speaking, as “Continuity”. It’s not like there’s an Xcode project inside Apple named Continuity.xcodeproj, and all the code that supports everything from AirDrop to Sidecar to iPhone Mirroring to clipboard sharing is all implemented in the same framework of code. It’s a marketing term, but a useful one — it helps Apple explain the features, and helps users understand them.
The same goes for “Apple Intelligence”. It doesn’t exist as a single thing or project. It’s a marketing term for a collection of features, apps, and services. Putting it all under a single obvious, easily remembered — and easily promoted — name makes it easier for users to understand that Apple is launching a new initiative. It also makes it easier for Apple to just say “These are the devices that qualify for all of these features, and other devices — older ones, less expensive ones — get none of them.”
Let’s say Apple were to quietly abandon the dumb Image Playground app next year. It just disappears from iOS 19 and MacOS 16. That would just be Apple eliminating a silly app that almost no one uses or should use. That wouldn’t be a setback or rollback of “Apple Intelligence”. I would actually argue that axing Image Playground would improve Apple Intelligence; its mere existence greatly lowers the expectations for how good the whole thing is.1
What I mean by that is that it was clear to me from the WWDC keynote onward that some of the features and aspects of Apple Intelligence were more ambitious than others. Some were downright trivial; others were proposing to redefine how we will do our jobs and interact with our most-used devices. That was clear. But yet somehow I didn’t focus on it. Apple itself strongly hinted that the various features in Apple Intelligence wouldn’t all ship at the same time. What they didn’t spell out, but anyone could intuit, was that the more trivial features would ship first, and the more ambitious features later. That’s where the red flags should have been obvious to me.
In broad strokes, there are four stages of “doneness” or “realness” to features announced by any company:
1. Features that the company’s own product representatives will demo, themselves, in front of the media. Smaller, more personal demonstrations are more credible than on-stage demos. But the stakes for a failed demo are higher in an auditorium full of observers.
2. Features that the company will allow members of the media (or other invited outside observers and experts) to try themselves, for a limited time, under the company’s supervision and guidance. Vision Pro demos were like this at WWDC 2023. A bunch of us got to use pre-release hardware and in-progress software for 30 minutes. It wasn’t like free range “Do whatever you want” — it was a guided tour. But we were the ones actually using the product. Apple allowed hands-on demos for a handful of media (not me) at Macworld Expo back in 2007 with prototype original iPhones — some of the “apps” were just screenshots, but most of the iPhone actually worked.
3. Features that are released as beta software for developers, enthusiasts, and the media to use on their own devices, without limitation or supervision.
4. Features that actually ship to regular users, and hardware that regular users can just go out and buy.
As of today — March 2025 — every feature in Apple Intelligence that has actually shipped was at level 1 back at WWDC. After the keynote, dozens of us in the press were invited to a series of small-group briefings where we got to watch Apple reps demo features like Writing Tools, Photos Clean Up, Genmoji, and more. We got to see predictive code completion in Xcode. Everything that has shipped as of today, they were able to show, in some functional state, back in June.
For example, there was a demo involving a draft email message on an iPad, and the Apple rep used Writing Tools to make it “more friendly”. I was in a group of just four or five other members of the media, watching this. As usual, we were encouraged to interrupt with questions. Knowing that LLMs are non-deterministic, I asked whether, as the Apple rep was performing this same demo for each successive group of media members, the “more friendly” result was exactly the same each time. He laughed and said no — that while the results are very similar each time, and he hopes they continue to be (hence the laughing), there were subtle differences sometimes between different runs of the same demo. As I recall, he even used Undo to go back to the original message text, invoked Writing Tools to make it “more friendly” again, and we could see that a few of the word choices were slightly different. That answered both my explicit question and my implicit one: Writing Tools generates non-deterministic results, and, more importantly, what we were watching really was a live demo.
We didn’t get to try any of the Apple Intelligence features ourselves. There was no Apple Intelligence “hands on”. But we did see a bunch of features demoed, live, by Apple folks. In my above hierarchy of realness, they were all at level 1.
But we didn’t see all aspects of Apple Intelligence demoed. None of the “more personalized Siri” features, the ones that Apple, in its own statement announcing their postponement, described as having “more awareness of your personal context, as well as the ability to take action for you within and across your apps”. Those features encompass three main things:
“In-app actions” — Giving Siri the ability, through the App Intents framework, to do things in and across apps that you can do, the old fashioned way (yourself) in and across apps. Again, here’s Apple’s own example usage:
You can make a request like “Send the email I drafted to April and Lilly” and Siri knows which email you’re referencing and which app it’s in. And Siri can take actions across apps, so after you ask Siri to enhance a photo for you by saying “Make this photo pop,” you can ask Siri to drop it in a specific note in the Notes app — without lifting a finger.
There were no demonstrations of any of that. Those features were all at level 0 on my hierarchy. That level is called vaporware. They were features Apple said existed, which they claimed would be shipping in the next year, and which they portrayed, to great effect, in the signature “Siri, when is my mom’s flight landing?” segment of the WWDC keynote itself, starting around the 1h:22m mark. Apple was either unwilling or unable to demonstrate those features in action back in June, even with Apple product marketing reps performing the demos from a prepared script using prepared devices.
This shouldn’t have just raised a concern in my head. It should have set off blinding red flashing lights and deafening klaxon alarms.
Even the very engineers working on a project never know exactly how long something is going to take to complete. An outsider observing a scripted demo of incomplete software knows far less (than the engineers) about how much more work it needs. But you can make a rough judgment. And that’s where my aforementioned hierarchy of realness comes into play. Even outsiders can judge how close a public beta (stage 3) feels to readiness. A feature or product that Apple will allow the press to play with, hands-on (stage 2), is further along than a feature or product that Apple is only willing to demonstrate themselves (stage 1).
But a feature or product that Apple is unwilling to demonstrate, at all, is unknowable. Is it mostly working, and close to, but not quite, demonstrable? Is it only kinda sorta working — partially functional, but far from being complete? Fully functional but prone to crashing — or in the case of AI, prone to hallucinations and falsehoods? Or is it complete fiction, just an idea at this point?
What Apple showed regarding the upcoming “personalized Siri” at WWDC was not a demo. It was a concept video. Concept videos are bullshit, and a sign of a company in disarray, if not crisis. The Apple that commissioned the futuristic “Knowledge Navigator” concept video in 1987 was the Apple that was on a course to near-bankruptcy a decade later. Modern Apple — the post-NeXT-reunification Apple of the last quarter century — does not publish concept videos. They only demonstrate actual working products and features.
Until WWDC last year, that is.
My deeply misguided mental framework for “Apple Intelligence” last year at WWDC was something like this: Some of these features are further along than others, and Apple is showing us those features in action first, and they will surely be the features that ship first over the course of the next year. The other features must be coming to demonstrable status soon. But the mental framework I should have used was more like this: Some of these features are merely table stakes for generative AI in 2024, but others are ambitious, groundbreaking, and, given their access to personal data, potentially dangerous. Apple is only showing us the table-stakes features, and isn’t demonstrating any of the ambitious, groundbreaking, risky features.
It gets worse. Come September, Apple held its annual big event at Apple Park to unveil the iPhone 16 lineup. Apple Intelligence features were highlighted in the announcement. Members of the media from around the world were gathered. That was a new opportunity, three months after WWDC, for Apple to demonstrate — or even better, offer hands-on access to the press to try themselves — the new personalized Siri features. They did not. No demos, at all. But they did promote them, once again, in the event keynote.2
But yet while Apple still wouldn’t demonstrate these features in person, they did commission and broadcast a TV commercial showing these purported features in action, presenting them as a reason to purchase a new iPhone — a commercial they pulled, without comment, from YouTube this week.
Last week’s announcement — “It’s going to take us longer than we thought to deliver on these features and we anticipate rolling them out in the coming year” — was, if you think about it, another opportunity to demonstrate the current state of these features. Rather than simply issue a statement to the media, they could have invited select members of the press to Apple Park, or Apple’s offices in New York, or even just remotely over a WebEx conference call, and demonstrated the current state of these features live, on an actual device. That didn’t happen. If these features exist in any sort of working state at all, no one outside Apple has vouched for their existence, let alone for their quality.
Why did Apple show these personalized Siri features at WWDC last year, and promise their arrival during the first year of Apple Intelligence? Why, for that matter, do they now claim to “anticipate rolling them out in the coming year” if they still currently do not exist in demonstrable form? (If they do exist today in demonstrable form, they should, you know, demonstrate them.)
I’m not trying to be obtuse here. It’s obvious why some executives at Apple might have hoped they could promote features like these at WWDC last year. Generative AI is the biggest thing to happen in the computer industry since previous breakthroughs this century like mobile (starting with the iPhone, followed by Android), social media (Meta), and cloud computing (Microsoft, Google, and Amazon). Nobody knows where it’s going, but wherever it’s heading, it’s going to be big, important, and perhaps profitable. Wall Street certainly noticed. And prior to WWDC last year, Apple wasn’t in the game. They needed to pitch their AI story. And a story that involved nothing but table-stakes AI features isn’t nearly as compelling a story as one that involves innovative, breakthrough, ambitious personal features.
But while there’s an obvious appeal to Apple pitching the most compelling, most ambitious AI story possible, the only thing that was essential was telling a story that was true. If the truth was that Apple only had features ready to ship in the coming year that were table stakes compared to the rest of the industry, that’s the story they needed to tell. Put as good a spin on it as possible, but them’s the breaks when you’re late to the game.
The fiasco here is not that Apple is late on AI. It’s also not that they had to announce an embarrassing delay on promised features last week. Those are problems, not fiascos, and problems happen. They’re inevitable. Leaders prove their mettle and create their legacies not by how they deal with successes but by how they deal with — how they acknowledge, understand, adapt, and solve — problems. The fiasco is that Apple pitched a story that wasn’t true, one that some people within the company surely understood wasn’t true, and they set a course based on that.
The Apple of the Jobs exile years — the Sculley / Spindler / Amelio Apple of 1987–1997 — promoted all sorts of amazing concepts that were no more real than the dinosaurs of Jurassic Park, and promised all sorts of hardware and (especially) software that never saw the light of day. Promoting what you hope to be able to someday ship is way easier and more exciting than promoting what you know is actually ready to ship. However close to financial bankruptcy Apple was when Steve Jobs returned as CEO after the NeXT reunification, the company was already completely bankrupt of credibility. Apple today is the most profitable and financially successful company in the history of the world. Everyone notices such success, and the corresponding accumulation of great wealth. Less noticed, but to my mind the more impressive achievement, is that over the last three decades, the company also accumulated an abundant reserve of credibility. When Apple showed a feature, you could bank on that feature being real. When they said something was set to ship in the coming year, it would ship in the coming year. In the worst case, maybe that “year” would have to be stretched to 13 or 14 months. You can stretch the truth and maintain credibility, but you can’t maintain credibility with bullshit. And the “more personalized Siri” features, it turns out, were bullshit.
Keynote by keynote, product by product, feature by feature, year after year after year, Apple went from a company that you couldn’t believe would even remain solvent, to, by far, the most credible company in tech. Apple remains at no risk of financial bankruptcy (and in fact remains the most profitable company in the world). But their credibility is now damaged. Careers will end before Apple might ever return to the level of “if they say it, you can believe it” credibility the company had earned at the start of June 2024.
Damaged is arguably too passive. It was squandered. This didn’t happen to Apple. Decision makers within the company did it.
Who decided these features should go in the WWDC keynote, with a promise they’d arrive in the coming year, when, at the time, they were in such an unfinished state they could not be demoed to the media even in a controlled environment? Three months later, who decided Apple should double down and advertise these features in a TV commercial, and promote them as a selling point of the iPhone 16 lineup — not just any products, but the very crown jewels of the company and the envy of the entire industry — when those features still remained in such an unfinished or perhaps even downright non-functional state that they still could not be demoed to the press? Not just couldn’t be shipped as beta software. Not just couldn’t be used by members of the press in a hands-on experience, but could not even be shown to work by Apple employees on Apple-controlled devices in an Apple-controlled environment? But yet they advertised them in a commercial for the iPhone 16, when it turns out they won’t ship, in the best case scenario, until months after the iPhone 17 lineup is unveiled?
When that whole campaign of commercials appeared, I — along with many other observers — was distracted by the fact that none of the features in Apple Intelligence had yet shipped. It’s highly unusual, and arguably ill-considered, for Apple to advertise any features that haven’t yet shipped. But one of those commercials was not at all like the others. The other commercials featured Apple Intelligence features that were close to shipping. We know today they were close to shipping because they were either in the iOS 18.1 betas already, in September, or would soon appear in developer betas for iOS 18.2 and 18.3. Right now, today, they’ve all actually shipped and are in the hands of iPhone 16 users. But the “Siri, what’s the name of the guy I had a meeting with a couple of months ago at Cafe Grenel?” commercial was entirely based on a feature Apple still has never even demonstrated.
Who said “Sure, let’s promise this” and then “Sure, let’s advertise it”? And who said “Are you crazy, this isn’t ready, this doesn’t work, we can’t promote this now?” And most important, who made the call which side to listen to? Presumably, that person was Tim Cook.
Even with everything Apple overpromised (if not outright lied about) at the WWDC keynote, the initial takeaway from WWDC from the news media was wrongly focused on their partnership with OpenAI. The conventional wisdom coming out of the keynote was that Apple had just announced something called “Apple Intelligence” but it was powered by ChatGPT, when in fact, the story Apple told was that they — Apple — had built an entire system called Apple Intelligence, entirely powered by Apple’s own AI technology, and that it spanned from on-device execution all the way to a new Private Cloud Compute infrastructure they not only owned but are powering with their own custom-designed server hardware based on Apple Silicon chips. And that on top of all that, as a proverbial cherry on top, Apple also was adding an optional integration layer with ChatGPT.
So, yes, given that the news media gave credit for Apple’s own actual announced achievements to OpenAI, Apple surely would have been given even less credit had they not announced the “more personalized Siri” features. It’s easy to imagine someone in the executive ranks arguing “We need to show something that only Apple can do.” But it turns out they announced something Apple couldn’t do. And now they look so out of their depth, so in over their heads, that not only are they years behind the state-of-the-art in AI, but they don’t even know what they can ship or when. Their headline features from nine months ago not only haven’t shipped but still haven’t even been demonstrated, which I, for one, now presume means they can’t be demonstrated because they don’t work.
In May 2011, Fortune published an extraordinary look inside Apple by Adam Lashinsky, at what we now know to be the peak, and (alas) end, of the Steve Jobs era. The piece opens thus:
Apple doesn’t often fail, and when it does, it isn’t a pretty sight at 1 Infinite Loop. In the summer of 2008, when Apple launched the first version of its iPhone that worked on third-generation mobile networks, it also debuted MobileMe, an e-mail system that was supposed to provide the seamless synchronization features that corporate users love about their BlackBerry smartphones. MobileMe was a dud. Users complained about lost e-mails, and syncing was spotty at best. Though reviewers gushed over the new iPhone, they panned the MobileMe service.
Steve Jobs doesn’t tolerate duds. Shortly after the launch event, he summoned the MobileMe team, gathering them in the Town Hall auditorium in Building 4 of Apple’s campus, the venue the company uses for intimate product unveilings for journalists. According to a participant in the meeting, Jobs walked in, clad in his trademark black mock turtleneck and blue jeans, clasped his hands together, and asked a simple question:
“Can anyone tell me what MobileMe is supposed to do?” Having received a satisfactory answer, he continued, “So why the fuck doesn’t it do that?”
For the next half-hour Jobs berated the group. “You’ve tarnished Apple’s reputation,” he told them. “You should hate each other for having let each other down.” The public humiliation particularly infuriated Jobs. Walt Mossberg, the influential Wall Street Journal gadget columnist, had panned MobileMe. “Mossberg, our friend, is no longer writing good things about us,” Jobs said. On the spot, Jobs named a new executive to run the group.
Tim Cook should have already held a meeting like that to address and rectify this Siri and Apple Intelligence debacle. If such a meeting hasn’t yet occurred or doesn’t happen soon, then, I fear, that’s all she wrote. The ride is over. When mediocrity, excuses, and bullshit take root, they take over. A culture of excellence, accountability, and integrity cannot abide the acceptance of any of those things, and will quickly collapse upon itself with the acceptance of all three. ★
Image Playground would make a ton of sense not as a consumer-facing app, but as an example project for developers. Long ago, Apple used to share the source code for TextEdit as an example project for Mac developers. (TextEdit is actually a low-key great application, though. It’s genuinely useful, reliable, and understandable.) Apple shares tons of sample code at WWDC each year. Image Playground would be a great sample project. The silly app icon even looks like something from a WWDC sample project. What Image Playground is not is a credibly useful generative AI tool. Yet Apple keeps talking about it — and showing it off in new hardware demonstrations — like it’s something they should be proud of and that anyone might credibly use for real-world work or even personal purposes. Image Playground does exemplify just how state-of-the-art the generative AI features are in Apple Intelligence, but not in the way Apple seems to think. ↩︎
Skip to the 53-minute mark of Apple’s September “It’s Glowtime” event introducing the iPhones 16, and it’s Craig Federighi who says the following:
“Siri will be able to tap into your personal context to help you in ways that are unique to you. Like pulling up the recommendation for the TV show that your brother sent you last month. And Siri will gain onscreen awareness. So when your friend texts you about a new album, you’ll be able to simply say, ‘Play that.’ And then you’ll be able to take hundreds of new actions in your apps, like updating a friend’s contact card with his new address, or adding a set of photos to a specific album. With Siri’s personal context understanding and action capabilities, you’ll be able to simply say, ‘Send Erica the photos from Saturday’s barbecue’, and Siri will dig up the photos and send them right off.”
That’s about 40 seconds of keynote time I bet Federighi regrets — and that I suspect he was skeptical about including. It’s telling, though, that unlike at WWDC, Apple didn’t show those features or spend even a full minute talking about them at the iPhone 16 event — despite the fact that, ostensibly, those features should have been three months closer to shipping than they were in June. Federighi’s title is SVP of software, and Apple Intelligence and Siri are “software”, but John Giannandrea (SVP of machine learning and AI strategy) is Federighi’s peer, not subordinate, on the org chart — both report directly to Tim Cook — and is responsible for Siri and Apple Intelligence. Why it was Federighi, not Giannandrea, pitching those features in the iPhone 16 event keynote almost certainly comes down to Federighi’s presentation skills and stage presence, not responsibility for the features themselves. But who’s going on camera to pitch these features and promise their future availability the next time? ↩︎︎
Ben Lovejoy, writing at 9to5Mac:
Our editor-in-chief Chance Miller wryly commented that a radical new look would serve as a great way to distract from the ever-slowing progress on the new Siri. But in truth, I think many more Apple users will be wowed by a new look than would ever care about Siri.
If that’s the way it works out — with a new visual look drawing attention from lackluster progress on the AI front — surely the timing will be coincidental, but some accidents are happy accidents, as Bob Ross used to say.
Sure, the new look introduced with iOS 7 was a massively controversial one, and many thought that then Apple design chief Jony Ive should never have been allowed anywhere near the software side of the business. But love it or hate it, you certainly couldn’t ignore it.
The same will be true of a new 3D look, which might even include some (much more modern) skeuomorphic elements. Probably as many will hate it as love it when it’s first introduced, as that seems to be true of any significant change made by the company, but it will likely make more of a bang than any improvement to Siri the company may introduce, then or later.
There should be no question that all of what Lovejoy is saying here is true. If Apple launches an all-new systemwide UI theme for iOS 19, something even half as radical a change as iOS 7’s theme was, it will be the only thing most users notice or opine about. Humans are fundamentally visual creatures and we notice visual changes. And, most humans are resistant to change. A lot of people simply dislike change, even changes for the better. (Honestly, I think that’s what the iOS 18 Photos complaints are about. Complaints about change itself.)
Part of what makes Apple Apple is that the company is (or at least should be) led by people who both have great taste and trust their own instincts. No one’s taste is perfect. Even Steve Jobs pushed through a few clunkers. But if you have great taste and confidence, you’ll do what your gut says is right.
When a company only pushes out changes that avoid controversy, it leads to paralysis. Then stagnation. Earlier today, Cabel Sasser wrote:
One of my strongest early developer memories was being in the “UI Feedback Forum” at WWDC after they introduced Aqua. Think of a live Q&A, but developers giving notes to a team of Apple engineers.
To these veteran Mac coders, the reaction to Aqua was universally negative. People were actively very angry. It’s a waste! It’s ugly! It’s confusing! How could you. It went on and on, and I was surprised because Aqua looked cool and fun to me.
After that WWDC, they never did another Feedback Forum.
It’s true. What a lot of Mac users wanted was a Mac OS X that looked like Mac OS 9.¹ The Aqua look and feel was definitely polarizing. And Apple dialed back its most exuberant details with each subsequent Mac OS X update — less transparency, subtler pinstripes (pinstripes!), etc. But iOS 7 was equally polarizing, and its excesses also got dialed back (or perhaps better said, dialed back up) with each successive iOS release — a little more depth, some subtle hints of texture.
Either Apple is never going to ship an altogether new UI theme, or they’ll ship one and a large number of people will declare it utter garbage and proof that Apple has completely lost its way. Maybe it will be garbage and proof that Apple has lost its way! Or, maybe it will actually prove to be a great new look that starts a decade-long industry-wide trend that all other companies will soon follow — which is what happened with Aqua’s 3D “lickability”, and happened again with iOS 7’s austere flattening. But either way, you won’t be able to judge it by asking for a show of hands from the general population when it’s unveiled. You either have taste or you don’t, and most people don’t, at least when judging something new and unfamiliar. ★
¹ Which is actually what 1999’s Mac OS X Server 1.0 looked like. It really was the best-looking version of the classic Platinum UI theme Apple ever released. ↩︎