Method 

My thanks to Method Financial for sponsoring last week at Daring Fireball. Method Financial’s authentication technology allows instant access to a consumer’s full liability portfolio using just personal information and consent, eliminating the need for usernames and passwords.

With just a few lines of code, Method’s APIs enable real-time, read-write, and frictionless access to all consumer liability data with integrated payment rails. Method leverages integrations with over 15,000 financial institutions to stream up-to-date, high-fidelity data from users’ accounts and to facilitate payment to them.

Method has helped 3 million consumers connect over 24 million liability accounts at companies like Aven, SoFi, Figure, and Happy Money, saving borrowers millions in interest and providing access to billions of dollars in personalized loans.

‘Meta and Apple: Same Game, Different Rules’ 

Jason Snell, Six Colors:

These are companies playing the same game, but in different ways. Who’s ahead? I would argue that it’s impossible to tell, because if Apple had a product like Orion we would never see it. We can argue about whether Apple’s compulsion to never, ever comment on unannounced products is beneficial or not, but it’s a Steve Jobs-created bit of Apple personality that is very unlikely to be countermanded any time soon.

Here’s how tenuous the Orion prototype is. Meta claims it would cost $10,000, but they haven’t said whether that would be the cost of goods or the retail price. But let’s give them the benefit of the doubt and say that the retail price would be just $10,000 if they brought this to market today. That’s expensive. But it’s not ridiculous. You can buy high-end workstation-class desktops that cost that much. A fully-specced 16-inch MacBook Pro costs $7,200.

But according to The Verge, these Orion prototypes only get 2 hours of battery life. And they’re too thick and chunky. You look weird, if not downright ugly, wearing them. So Meta not only needs to bring the price down by a factor of at least 3× (which would put it around the $3,500 price of Vision Pro, which most critics have positioned as too expensive), they also need to make the glasses smaller — more svelte — while increasing battery life significantly. Those two factors are in direct contradiction with each other. The only easy way to increase battery life is to put a bigger battery in the device, which makes the device itself thicker and heavier. (See this year’s iPhone 16 Pro models.)

Orion by all accounts is a really compelling demo. But it’s also very clearly a prototype device only suitable for demos. Even at $10,000 retail it wouldn’t be compelling today. Yet somehow Meta wants us to believe they have “line of sight” to a compelling consumer product at a compelling price.

It’s exciting that they showed Orion publicly, but I don’t think it helped Meta in any way going forward. There’s a reason why Apple didn’t show off a prototype iPhone in 2004.

Relay for St. Jude Is Approaching $1 Million 

Every September, the whole extended family at Relay FM raises money for St. Jude Children’s Research Hospital, one of the most amazing institutions in the world. They’re dedicated to curing childhood cancer and helping families affected by it. Since 2019 they’ve raised over $3 million, and their best-ever single month was just north of $775,000.

This year they’re already at $925,000, within shouting distance of a cool million, with three days to go in the month. Let’s make that happen.

Hidden Pref to Restore Slow-Motion Dock Minimizing on MacOS 

In the midst of recording last week’s episode of The Talk Show with Nilay Patel, I offhandedly mentioned the age-old trick of holding down the Shift key while minimizing a window (clicking the yellow button) to see the genie effect in slow motion. Nilay was like “Wait, what? That’s not working for me...” and we moved on.

What I’d forgotten is that Apple had removed this as default behavior a few years ago (I think in MacOS 10.14 Mojave), but you can restore the feature with this hidden preference, typed in Terminal:

defaults write com.apple.dock slow-motion-allowed -bool YES

Then restart the Dock:

killall Dock

Or, in a single command:

defaults write com.apple.dock slow-motion-allowed -bool YES; killall Dock
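
To turn it back off later, deleting the hidden preference and restarting the Dock should do it (this is just the standard way to clear a defaults override; Apple doesn’t document it for this key):

defaults delete com.apple.dock slow-motion-allowed; killall Dock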

I had forgotten that this had become a hidden preference, and that I’d long ago enabled it.

Tom’s Guide iPhone 16 Battery Life Testing Shows Impressive Year-Over-Year Gains 

Tom Pritchard, writing at Tom’s Guide:

We put the iPhone 16, iPhone 16 Plus, iPhone 16 Pro and the iPhone 16 Pro Max through the Tom’s Guide battery test, which involves surfing the web over 5G at 150 nits of screen brightness. The iPhone 16 Pro Max and iPhone 16 Plus have risen to the top with some incredibly impressive results — making our best phone battery life list in the process. Here’s how the new iPhone 16 models’ battery life stacks up against their iPhone 15 counterparts, and rival flagships.

WSJ: ‘Apple Is No Longer in Talks to Join OpenAI Investment Round’ 

Tom Dotan and Berber Jin, reporting late last night for The Wall Street Journal (News+):

Apple is no longer in talks to participate in an OpenAI funding round expected to raise as much as $6.5 billion, an 11th hour end to what would have been a rare investment by the iPhone maker in another major Silicon Valley company. Apple recently fell out of the talks to join the round, which is slated to close next week, according to a knowledgeable person.

I just observed the other day that the tumultuous (to say the least) leadership situation at OpenAI seems incongruous with Apple’s.

Also surely related, to some degree, is this report on OpenAI’s financials that dropped yesterday from Mike Isaac and Erin Griffith at The New York Times:

OpenAI’s monthly revenue hit $300 million in August, up 1,700 percent since the beginning of 2023, and the company expects about $3.7 billion in annual sales this year, according to financial documents reviewed by The New York Times. OpenAI estimates that its revenue will balloon to $11.6 billion next year.

But it expects to lose roughly $5 billion this year after paying for costs related to running its services and other expenses like employee salaries and office rent, according to an analysis by a financial professional who has also reviewed the documents. Those numbers do not include paying out equity-based compensation to employees, among several large expenses not fully explained in the documents.

Rob Pike, on Mastodon:

OpenAI: We lose a little on every sale but we make it up in volume.

iA Writer’s Android App Is Frozen in Carbonite 

iA, which has been shipping a version of iA Writer for Android for 7 years:

By September, we thought we had honored our side of the new agreement. But on the very day we expected to get our access back, Google altered the deal.

We were told that read-only access to Google Drive would suit our writing app better than the desired read/write access. That’s right — read-only for a writing app.

When we pointed out that this was not what we had, or what our users wanted, Google seemed to alter the deal yet again. In order to get our users full access to their Google Drive on their devices, we now needed to pass a yearly CASA (Cloud Application Security Assessment) audit. This requires hiring a third-party vendor like KPMG.

The cost, including all internal hours, amounts to about one to two months of revenue that we would have to pay to one of Google’s corporate amigos. An indie company handing over a month’s worth of revenue to a “Big Four” firm like KPMG for a pretty much meaningless scan. And, of course, this would be a recurring annual expense. More cash for Google’s partners, while small developers like us foot the bill for Android’s deeply ingrained security shortcomings.

Developing serious productivity apps for Android sounds like fun. (See also the footnote on how stunningly rampant piracy is on Android, too.)

X Blocks Links to Stolen J.D. Vance Dossier 

Elizabeth Lopatto, reporting for The Verge:

X is preventing users from posting links to a newsletter containing a hacked document that’s alleged to be the Trump campaign’s research into vice presidential candidate JD Vance. The journalist who wrote the newsletter, Ken Klippenstein, has been suspended from the platform. Searches for posts containing a link to the newsletter turn up nothing.

Posting this just in case there remained an iota of a thought in your head that Elon Musk is actually a radical “free speech” absolutist and not just someone who blew $44 billion buying Twitter to warp the entire platform in the direction of his own weird un-American political agenda.

The Talk Show: ‘The Dynamic Paradox’ 

Nilay Patel returns to the show to consider the iPhones 16.

Sponsored by:

  • Tiptop: A new way to pay that combines Instant Trade-In and Pay-in-4 at checkout.
  • Squarespace: Make your next move. Use code talkshow for 10% off your first order.

CTO Mira Murati Abruptly Leaves OpenAI, Which Is Now Set to Become a For-Profit Company 

Deepa Seetharaman, Berber Jin, and Tom Dotan, reporting for The Wall Street Journal (News+):

OpenAI is planning to convert from a nonprofit organization to a for-profit company at the same time it is undergoing major personnel shifts including the abrupt resignation Wednesday of its chief technology officer, Mira Murati. Becoming a for-profit would be a seismic shift for OpenAI, which was founded in 2015 to develop AI technology “to benefit humanity as a whole, unconstrained by a need to generate financial return,” according to a statement it published when it launched.

I guess I wasn’t paying close enough attention, but I wrongly thought this whole debate over turning OpenAI into a for-profit corporation had been decided a year ago, during the brief saga when the then-board of directors fired Sam Altman for being profit-driven, and then the board itself dissolved and Altman was brought back.

Back to the Journal’s report:

Things started to change in late 2022 when it released ChatGPT, which became an instant hit and sparked global interest in the potential of generative artificial intelligence to reshape business and society. Guided by Chief Executive Sam Altman, OpenAI started releasing new products for consumers and corporate clients and hired a slew of sales, strategy and finance staffers. Employees, including some who had been there from the early days, started to complain that the company was prioritizing shipping products over its original mission to build safe AI systems.

Some left for other companies or launched their own, including rival AI startup Anthropic. The exodus has been particularly pronounced this year. Before Murati, OpenAI’s co-founder and former chief scientist Ilya Sutskever, co-founder and former top researcher John Schulman, and former top researcher Jan Leike all resigned since May. Co-founder and former president Greg Brockman recently took a leave of absence through the end of the year.

In addition to Murati, chief research officer Bob McGrew and head of post-training Barret Zoph also are leaving OpenAI, according to a post on X from Altman.

OpenAI has high-profile partnerships with both Microsoft and Apple, two companies with decades of extraordinarily stable executive leadership. But OpenAI itself seems to be in a state of constant executive disarray and turmoil. That’s a bit of a head-scratcher to me.

LG Smart TVs, Including OLEDs, Now Show Screensaver Ads 

Rasmus Larsen, writing for FlatpanelsHD:

While reviewing LG’s latest high-end G4 OLED TV (review here), FlatpanelsHD discovered that it now shows full-screen screensaver ads. The ad appeared before the conventional screensaver kicks in, as shown below, and was localized to the region the TV was set to.

We saw an ad for LG Channels — the company’s free, ad-supported streaming service — but there can also be full-screen ads from external partners, as shown in the company’s own example below.

Death comes for us all.

Amazon, Google, and Roku have long built their respective TV monetization strategies around ads, and with LG and Samsung turning webOS and Tizen into digital billboards, the only refuge appears to be Apple TV 4K, which can be connected to any TV. Then you can simply disconnect the TV itself from the internet.

I bought an LG OLED in 2020 that hasn’t been connected to the internet since a few days after we started using it. It’s a great TV.

iMore Closes Down 

Gerald Lynch, editor-in-chief:

Dig out your old iPod and fire up your ‘Songs to cry to’ playlist, I come bearing sad news. After more than 15 years covering everything Apple, it’s with a heavy heart I announce that we will no longer be publishing new content on iMore.

I want to kick off by thanking you all for your support over the many years and incarnations of the site. Whether you were a day-one early adopter in the ‘PhoneDifferent’ days, came on board with ‘The iPhone Blog,’ or recently started reading to find out what the hell Apple Vision Pro is, it’s been a privilege to serve you a daily slice of Apple pie.

So it goes. Nice remembrances from Rene Ritchie (now at YouTube) and Serenity Caldwell (now at Apple).

A Few More Thoughts on Orion and the New Frontier 

Just appended the following to my piece from yesterday on Meta’s Orion AR glasses prototype:

  1. Facebook ships VR headsets and a software platform with an emphasis so strong on “the metaverse” that they rename the company Meta.
  2. Apple announces, and then 7 months later ships, Vision Pro with a two-fold message in comparison: (a) the “metaverse” thing is so stupid they won’t even use the term; (b) overwhelmingly superior resolution and experiential quality. Consumer response, however, is underwhelming.
  3. Meta drops the “metaverse” thing and previews Orion, effectively declaring that they think VR headsets are the wrong thing to build to create the product that defines the next breakthrough step change in personal computing. AR glasses, not VR headsets, are the goal.

It’s a lot of back-and-forth volleying, which is what makes the early years of a new device frontier exciting and highly uncertain. Big bold ideas get tried out, and most of them wind up as dead ends to abandon. Compare and contrast to where we’ve been with laptops for the last 20 years, or the pinnacle we appear to have reached in recent years with phones.


A Few Brief Thoughts on Meta Connect 2024

I’ll link first to The Verge’s “Everything Announced at Meta Connect 2024” roundup because Meta still hasn’t posted today’s keynote address on YouTube; best I’ve found is this recording of the livestream, starting around the 43m:20s mark. I watched most of the keynote live and found it engaging. Just 45 minutes long — partly because it was information dense, and partly because Mark Zuckerberg hosted the entire thing himself. He seems very comfortable, confident, and natural lately. Nothing slows an on-stage keynote down more than a parade of VPs. There was clearly no political infighting at Meta for stage time in this keynote. The keynote was Zuck’s, and because of that, it was punchy and brisk.

In terms of actual products that will actually ship, Meta announced the $300 Quest 3S. That’s more than an entire order of magnitude lower-priced than Vision Pro. Vision Pro might be more than 10× more capable than Quest 3S, but I’m not sure it’s 10× better for just playing games and watching movies, which might be the only things people want to do with headsets at the moment. They also launched a 7,500-unit limited edition of their $430 actually-somewhat-popular Ray-Ban Wayfarer “smart” glasses made with translucent plastic, iMac-style. It’s been a while since someone made a “look at the insides” consumer device. That’s fun, and a little quirky, too.

The big reveal was Orion, a working prototype of see-through AR glasses. Meta themselves are describing them as a “dev kit”, but not only are they not available for purchase, they’re not available, period. They’re internal prototypes for Meta’s own developers, not outside developers. They do seem interesting, for a demo, and I’m hearing from our Dithering correspondent on the scene in Menlo Park that using them is genuinely compelling. There can be no argument that actual glasses are the form factor for AR.

The Verge’s Alex Heath opened his piece on Orion today with this line:

They look almost like a normal pair of glasses.

That’s stretching the meaning of “almost” to a breaking point. I’d say they look vaguely kinda-sorta like a pair of normal glasses. Both the frames (super chunky) and the lenses (thick, prismatic, at times glowing) are conspicuous. They look orthopedic, like glasses intended to aid people whose vision is so low they’re legally blind. It really is true that Meta’s Ray-Ban Wayfarers are nearly indistinguishable from just plain Wayfarers. Orion isn’t like that at all. If you went out in public with these — which you can’t, because they’re internal prototypes — everyone would notice that you’re wearing some sort of tech glasses, or perhaps think you walked out of a movie theater without returning the 3D goggles. But: you could wear them in public if you wanted to, and unlike going out in public wearing a VR headset, you’d just look like a nerd, not a jackass. They’re close to something. But how close to something that would actually matter, especially price-wise, is debatable. From Heath’s report:

As Meta’s executives retell it, the decision to shelve Orion mostly came down to the device’s astronomical cost to build, which is in the ballpark of $10,000 per unit. Most of that cost is due to how difficult and expensive it is to reliably manufacture the silicon carbide lenses. When it started designing Orion, Meta expected the material to become more commonly used across the industry and therefore cheaper, but that didn’t happen.

“You can’t imagine how horrible the yields are,” says Meta CTO Andrew Bosworth of the lenses. Instead, the company pivoted to making about 1,000 pairs of the Orion glasses for internal development and external demos.

Snap recently unveiled their new Spectacles, which they’re leasing, not selling, for $1,200 per year. Snap’s Spectacles are so chunky they make Orion look inconspicuous in comparison. But the race to bring AR glasses to market is clearly on.

See Also: Heath’s interview with Zuckerberg for Decoder.

Next-Day Addendum: I woke up this morning with the following competitive back-and-forth in my head:

  1. Facebook ships VR headsets and a software platform with an emphasis so strong on “the metaverse” that they rename the company Meta.
  2. Apple announces, and then 7 months later ships, Vision Pro with a two-fold message in comparison: (a) the “metaverse” thing is so stupid they won’t even use the term; (b) overwhelmingly superior resolution and experiential quality. Consumer response, however, is underwhelming.
  3. Meta drops the “metaverse” thing and previews Orion, effectively declaring that they think VR headsets are the wrong thing to build to create the product that defines the next breakthrough step change in personal computing. AR glasses, not VR headsets, are the goal.

It’s a lot of back-and-forth volleying, which is what makes the early years of a new device frontier exciting and highly uncertain. Big bold ideas get tried out, and most of them wind up as dead ends to abandon. Compare and contrast to where we’ve been with laptops for the last 20 years, or the pinnacle we appear to have reached in recent years with phones. 


Masimo Founder Joe Kiani Resigns as CEO Following Ouster From Board 

Reuters:

Masimo said on Wednesday founder Joe Kiani has decided to step down as the medical device maker’s CEO, days after shareholders voted to remove him from the company’s board following a bitter proxy battle with activist hedge fund Politan Capital Management.

The company named veteran healthcare executive, Michelle Brennan, as interim chief. Brennan was nominated by Politan for Masimo’s board last year, along with the hedge fund’s founder Quentin Koffey. Both were subsequently elected by shareholders.

Shares of the company were up 5.4% at $133 in early trade. The stock has fallen more than 40% since Feb. 15, 2022, when Masimo announced the $1-billion acquisition of audio products maker Sound United. The deal was a key factor behind Politan’s activism.

No mention of Apple in Reuters’s report, but I wouldn’t be surprised if there’s soon a settlement in the patent dispute over the blood oxygen sensors in recent Apple Watch models that’s left the feature disabled for new watches sold in the U.S. this year. My understanding is that Kiani was single-mindedly obsessed with fighting Apple over this.

Sean ‘Diddy’ Combs and Sam Bankman-Fried Are Cellmates in Arkham Asylum 

Holds hand to earpiece... Correction: they’re bunking in a jail cell together in Brooklyn, not Gotham. We regret the error.


Panels, a New Wallpaper App From Marques Brownlee

A swing-and-a-miss from MKBHD. Criticism of the app is on two separate levels, but they’re being conflated. Level 1: the app is not good. Level 2: a paid wallpaper app? — LOL, wallpapers are free on Reddit. That second form of criticism — that there shouldn’t even exist a paid wallpaper app — is annoying and depressing, and speaks to how many people do not view original art as worth paying for. But it also speaks to the breadth of Brownlee’s audience, which I think is more tech-focused than design-focused. Scott Smith, on Mastodon, observed:

It’s really interesting to compare the reaction from the indie iOS community of @Iconfactory’s Wallaroo to the mainstream tech community’s reaction to @mkbhd’s Panels. I know they are not the same by any means but it sheds light on how many people in mainstream tech circles are still flabbergasted at paying for artwork.

So there’s that, and it is what it is. To some extent that freeloading cheapskate perspective can be ignored. If one’s argument is that all wallpapers ought to be free, that’s not a valid starting point for criticism of a paid wallpaper app/service.

The problem is, Panels is not a good app:

  • It crashed on me during first run on iPhone.

  • The UI is big and bulbous, and while it looks almost the same on iOS and Android (which is probably why it’s so crude), it looks native on neither platform. It looks and feels more like the interface to a game than an app. If anything, it looks and feels more Android-y than iOS-y, if only because “doesn’t really look native anywhere” is more of an Android thing. If Brownlee is down with how this app looks and feels, it explains quite a bit (more) about how he’s willing to spend large stretches of time daily-driving Android phones.

  • Totally subjective, but I don’t think the wallpapers themselves are good. I mean like none of them. They feel like user-generated content, not professional content curated by a trusted tastemaker like Brownlee.

  • The app has a crummy privacy report card, including using your general location for tracking, and on iOS brings up the “Ask App Not to Track” dialog. It’s even worse on Android. Not premium. (Panels doesn’t ask for GPS location access, but it uses your IP address for tracking, which Apple classifies as “location”. Apple ought to clarify that distinction in App Store privacy report cards — asking for GPS is not the same thing at all as IP-based geolocation — but it’s a bad look for the app.)

  • “SD” (1080p) wallpapers are free to download from some creators but require watching a minute or two of video ads. Not premium.

  • Subscribing costs $50/year or $12/month ($144 a year!) — which are, to say the least, premium prices. (Wallaroo is a far better app with — subjectively — far better wallpapers and costs $20/year or $2/month.)

It’s entirely plausible for a premium wallpaper app to justify a price of $50/year. But Panels isn’t a premium app. Premium apps don’t ask to track you across apps. Premium apps don’t make you watch ads to get to their free content. Premium apps look and feel native to their platforms. Premium apps don’t have sketchy privacy report cards. As it stands, Panels is incongruous and incoherent. 


Juli Clover Limited Her iPhone 15 Pro Max to the 80 Percent Charging Limit for an Entire Year 

Juli Clover, MacRumors:

With the iPhone 15 models that came out last year, Apple added an opt-in battery setting that limits maximum charge to 80 percent. The idea is that never charging the iPhone above 80 percent will increase battery longevity, so I kept my iPhone at that 80 percent limit from September 2023 to now, with no cheating.

My iPhone 15 Pro Max battery level is currently at 94 percent with 299 cycles. For a lot of 2024, my battery level stayed above 97 percent, but it started dropping more rapidly over the last couple of months. I left my iPhone at that 80 percent limit and at no point turned the setting off or tweaked it. [...] You can compare your battery level to mine, but here are a couple other metrics from MacRumors staff that also have an iPhone 15 Pro Max and did not have the battery level limited.

  • Current capacity: 87%. Cycles: 329
  • Current capacity: 90%. Cycles: 271

My year-old iPhone 15 Pro (not Max), which I simply used every day and charged to 100 percent overnight: max capacity 89 percent, 344 charge cycles.

I’m so glad Clover ran this test for a year and reported her results, because it backs up my assumption: for most people there’s no practical point to limiting your iPhone’s charging capacity. All you’re doing is preventing yourself from ever enjoying a 100-percent-capacity battery. Let the device manage its own battery. Apple has put a lot of engineering into making that really smart.

iPhone 16 Models Now Use an Electrically-Released Adhesive 

Donald Papp at Hackaday:

There’s a wild new feature making repair jobs easier (not to mention less messy) and iFixit covers it in their roundup of the iPhone 16’s repairability: electrically-released adhesive.

Here’s how it works. The adhesive looks like a curved strip with what appears to be a thin film of aluminum embedded into it. It’s applied much like any other adhesive strip: peel away the film, and press it between whatever two things it needs to stick. But to release it, that’s where the magic happens. One applies a voltage (a 9V battery will do the job) between the aluminum frame of the phone and a special tab on the battery. In about a minute the battery will come away with no force, and residue-free.

Clever.

Nilay Patel’s iPhone 16 Pro Review Addresses the Nilay-Patel-iest of Questions: What Is a Photo? 

Nilay Patel, writing at The Verge last week:

I asked Apple’s VP of camera software engineering Jon McCormack about Google’s view that the Pixel camera now captures “memories” instead of photos, and he told me that Apple has a strong point of view about what a photograph is — that it’s something that actually happened. It was a long and thoughtful answer, so I’m just going to print the whole thing:

Here’s our view of what a photograph is. The way we like to think of it is that it’s a personal celebration of something that really, actually happened.

Whether that’s a simple thing like a fancy cup of coffee that’s got some cool design on it, all the way through to my kid’s first steps, or my parents’ last breath, it’s something that really happened. It’s something that is a marker in my life, and it’s something that deserves to be celebrated.

And that is why when we think about evolving in the camera, we also rooted it very heavily in tradition. Photography is not a new thing. It’s been around for 198 years. People seem to like it. There’s a lot to learn from that. There’s a lot to rely on from that.

Think about stylization, the first example of stylization that we can find is Roger Fenton in 1854 — that’s 170 years ago. It’s a durable, long-term, lasting thing. We stand proudly on the shoulders of photographic history.

That’s a sharp and clear answer, but I’m curious how Apple contends with the relentless addition of AI editing to the iPhone’s competitors. The company is already taking small steps in that direction: a feature called “Clean Up” will arrive with Apple Intelligence, which will allow you to remove objects from photos like Google’s Magic Eraser.

McCormack’s response is genuinely thoughtful, and resonates deeply with my own personal take. But it’s worth noting that Apple is the conservative company when it comes to generative AI and photography — and yet they’re still shipping Clean Up. I’m not complaining about Clean Up’s existence. I’ve already used it personally. I’m just saying that even Apple’s stance involves significant use of generative AI.

LoveFrom Now Has a Mascot: Montgomery the Bear 

Mark Wilson, writing at Fast Company:

Now, a full five years later, we are meeting the LoveFrom mascot, Montgomery: a bear, paying homage to San Francisco’s Montgomery Street where LoveFrom is headquartered and will soon open its own store.

Montgomery has just appeared on LoveFrom’s website, where it will sniff and follow your cursor, before slowly navigating over the letters of LoveFrom like rocks in a pond.

A lovely mark and even lovelier animation.

Tripp Mickle Profiles Jony Ive and LoveFrom, Confirms OpenAI Partnership for Device 

Tripp Mickle, writing for The New York Times:

Mr. Ive and Mr. Altman met for dinner several more times before agreeing to build a product, with LoveFrom leading the design. They have raised money privately, with Mr. Ive and Emerson Collective, Ms. Powell Jobs’s company, contributing, and could raise up to $1 billion in start-up funding by the end of the year from tech investors.

In February, Mr. Ive found office space for the company. They spent $60 million on a 32,000-square-foot building called the Little Fox Theater that backs up to the LoveFrom courtyard. He has hired about 10 employees, including Tang Tan, who oversaw iPhone product development, and Evans Hankey, who succeeded Mr. Ive in leading design at Apple.

On a Friday morning in late June, Mr. Tan and Ms. Hankey could be seen wheeling chairs between the Little Fox Theater and the nearby LoveFrom studio. The chairs were topped by papers and cardboard boxes with the earliest ideas for a product that uses A.I. to create a computing experience that is less socially disruptive than the iPhone.

The project is being developed in secret. Mr. Newson said that what the product would be and when it would be released were still being determined.

I feel like Mickle somewhat buried the lede here. Architectural projects, magnetic buttons for $2,000 jackets, a lovely new typeface, new steering wheels for electric Ferrari sports cars — all of those design projects are interesting. But an OpenAI-powered personal electronic device, with longtime Apple all-stars Evans Hankey and Tang Tan leading the small team? That’s interesting. That’s competing against Apple. That’s complicated given Ive’s legendary history with Apple. It’s further complicated by the fact that most of LoveFrom’s designers came with Ive from Apple. It’s complicated even further by Powell Jobs’s backing of the startup.

Also somewhat interesting to me is the timing of Mickle’s profile. He spoke with Ive and Marc Newson back in June, but the story was published ... the very day after the arrival of Apple’s new iPhones, AirPods, and watches. That timing might have been entirely the choice of the Times. But still, it’s hard not to notice.

And the whole thing is made even stranger given OpenAI’s partnership with Apple to provide “world knowledge” generative AI by the end of this year. Can’t help but think of then-Google-CEO Eric Schmidt being an Apple board member when the iPhone debuted — with built-in system apps for Google Maps and YouTube — while Google was simultaneously building Android to compete.

iOS 18 Messages Bug, Triggered When Sharing an Apple Watch Face, Causes Crash and Data Loss 

Zac Hall, with a public service message at 9to5Mac:

For now, the bug is triggered when someone replies to a shared watch face in a thread on Messages in iOS 18. The threaded responses feature allows you to have an inline conversation about a specific message that may have been sent earlier in the chat.

If this happens, Messages will repeatedly crash if the user tries to open the conversation in the app. Sending or responding to conversations from other chats directly in Messages is also not easily possible as the app may repeatedly crash.

Once triggered, the bug affects both users. It appears to require responding to the shared watch face from iOS 18. Replying from iOS 18.1 will not trigger the bug.

However, if the user responds in a thread to the shared watch face, Messages will crash on iOS 18.1 beta, iPadOS 18.1 beta, and macOS 15.1 beta as well.

I suspect we’ll see an iOS 18.0.1 update imminently that includes a fix for this. It’s a nasty bug, though.

WorkOS 

My thanks to WorkOS for, once again, sponsoring last week at Daring Fireball. WorkOS is a modern identity platform for B2B SaaS. Start selling to enterprise customers with just a few lines of code. Ship complex features like SSO and SCIM (pronounced skim) provisioning in minutes instead of months.

Today, some of the fastest growing startups are already powered by WorkOS, including Perplexity, Vercel, and Webflow.

For SaaS apps that care deeply about design and user experience, WorkOS is the perfect fit. From high-quality documentation to self-serve onboarding for your customers, it removes all the unnecessary complexity for your engineering team.

Qualcomm Is Trying to Acquire Intel 

Lauren Thomas, Laura Cooper, and Asa Fitch, reporting for The Wall Street Journal (News+):

Chip giant Qualcomm made a takeover approach to rival Intel in recent days, according to people familiar with the matter, in what would be one of the largest and most consequential deals in recent years. A deal for Intel, which has a market value of roughly $90 billion, would come as the chip maker has been suffering through one of the most significant crises in its five-decade history.

A deal is far from certain, the people cautioned. Even if Intel is receptive, a deal of that size is all but certain to attract antitrust scrutiny, though it is also possible it could be seen as an opportunity to strengthen the U.S.’s competitive edge in chips. To get the deal done, Qualcomm could intend to sell assets or parts of Intel to other buyers.

Intel — once the world’s most valuable chip company — had seen its shares drop roughly 60% so far this year before The Wall Street Journal reported on the approach. As recently as 2020, the company had a market value above $290 billion.

If you’d told me even just 10 years ago that one of these two would acquire the other in 2024, I’d have lost my shirt betting it would be Intel acquiring Qualcomm, not the other way around. What an ignominious demise this would be for the company that put the silicon in Silicon Valley. But here we are.

Danny Boyle’s ‘28 Years Later’, a $75 Million Feature Film, Was Shot Using iPhone 15 Pro Max 

Carlton Reid, writing for Wired:

The use of Apple smartphones as the principal camera system on 28 Years Later was subsequently confirmed to Wired by several people connected with the movie, detailing that the particular model used to shoot was the iPhone 15 Pro Max. [...]

Several arthouse films have been shot with iPhones, including Sean Baker’s Tangerine (2015) and the Steven Soderbergh drama Unsane (2018), but these movies were limited-release, low-budget offerings compared to 28 Years Later. The new film’s $75 million budget is only part of the franchise’s total, with 28 Years Later being the first of a new trilogy; all three coming zombie films are being scripted by screenwriter Alex Garland, who is reuniting with Boyle and Mantle after helming Civil War, released earlier this year.

Right there, in your pocket, all day every day.

Austin Mann’s iPhone 16 Pro Camera Review: Kenya 

Austin Mann:

Over the past week we’ve traveled over a thousand kilometers across Kenya, capturing more than 10,000 photos and logging over 3TB of ProRes footage with the new iPhone 16 Pro and iPhone 16 Pro Max cameras. Along the way, we’ve gained valuable insights into these camera systems and their features. [...]

The moment I learned about it, I knew exactly how I wanted to test it: a photo shoot with Craig.

A few months ago, my friend Bobby Neptune introduced me to Craig, one of the last remaining super tusker elephants roaming the earth. He is unique not only because of his enormous tusks, but also because of his extremely gentle demeanor and his curious habit of often approaching safari cruisers. I knew if we were lucky enough to find him, it might be a great opportunity to put the new Ultra Wide sensor to the test.

Bobby got on the phone with David, a local maasai who tracks Craig daily, and we met up with him to see if we could locate this beautiful animal.

I had dinner with Mann and a few other photo-nerd friends in California last week, after the Apple event, and when he told me his plan for Kenya — particularly his hope that they’d be able to find Craig — I crossed my fingers. What a magnificent animal.

Absolutely stunning work — still and video — from Mann, too. Inspiring to know that this same camera is in my pocket every day.

Shohei Ohtani’s Game for the Ages 

Los Angeles Dodgers DH Shohei Ohtani entered last night’s game in Miami with 49 stolen bases and 48 home runs for the season. Only five other players in MLB history have ever hit 40 home runs and stolen 40 bases in a season. No player had ever achieved a 50-50 season.

Ohtani went 6-for-6, hitting 3 homers, stealing 2 bases, and driving in 10 runs — thus breaking 50 in both categories in the same game. Unreal. It’s the second-greatest single-game performance by a hitter in baseball history. (Everyone knows the greatest, especially Dodgers fans.)

Postscript: I want to add my sincere kudos to Marlins manager Skip Schumaker (great baseball name). Asked why he didn’t walk Ohtani intentionally, he replied, “That’s a bad move, baseball-wise, karma-wise, baseball-gods-wise. You go after him and see if you can get him out.” That’s what he said to the media after the game. During the game, in the dugout, when asked if the Marlins should walk Ohtani, he put it even better: “Fuck that.”

Elon Musk, Useful Idiot 

Speaking of “Leon” Musk, here’s Victor Tangermann, writing for The Byte:

In March, the US government accused the founder of Moscow-based company Social Design Agency of orchestrating a “persistent foreign malign influence campaign” on behalf of the Kremlin. The propaganda operation, dubbed “Doppelganger,” involved spreading memes, deepfaked videos, and falsified documents online to alienate the West from Ukraine and its leadership following Russia’s 2022 invasion.

The company is overseen by a top aide to Russian president Vladimir Putin, and has attempted to discredit Ukraine’s military and political leadership by flooding social media with propaganda.

And as it turns out, infamously gullible billionaire and X-formerly-Twitter owner Elon Musk appears to have gotten duped by the campaign. In October 2023, Musk shared a meme to his millions of followers showing Ukrainian president Volodymyr Zelensky straining his face, with the caption reading: “When it’s been 5 minutes and you haven’t asked for a billion dollars in aid.”

According to leaked documents obtained by a number of European media outlets, the meme — alongside a number of other posts that were not shared by Musk — was put together by none other than the Social Design Agency.

Most people might reconsider their social media habits after finding out they inadvertently shared Russian-made propaganda memes.

Cards Against Humanity Sues Elon Musk (SpaceX) for $15 Million 

Cards Against Humanity:

We have terrible news. Seven years ago, 150,000 people paid us $15 to protect a pristine parcel of land on the US-Mexico border from racist billionaire Donald Trump’s very stupid wall.

Unfortunately, an even richer, more racist billionaire — Elon Musk — snuck up on us from behind and completely fucked that land with gravel, tractors, and space garbage.

Just look at it! He fucked it.

How did this happen? Elon Musk’s SpaceX was building some space thing nearby, and he figured he could just dump his shit all over our gorgeous plot of land without asking. After we caught him, SpaceX gave us a 12-hour ultimatum to accept a lowball offer for less than half our land’s value. We said, “Go fuck yourself, Elon Musk. We’ll see you in court.”

European Commission Opens ‘Specification Proceedings’, Ostensibly to Tell Apple Exactly What to Do 

The European Commission yesterday:

Today, the European Commission has started two specification proceedings to assist Apple in complying with its interoperability obligations under the Digital Markets Act (‘DMA’). [...] Pursuant to Article 8(2) of the DMA, the Commission may, on its own initiative, adopt a decision specifying the measures a gatekeeper has to implement to ensure effective compliance with substantive DMA obligations, such as the interoperability obligation of Article 6(7) DMA.

The first proceeding focuses on several iOS connectivity features and functionalities, predominantly used for and by connected devices. [...] The Commission intends to specify how Apple will provide effective interoperability with functionalities such as notifications, device pairing and connectivity.

The second proceeding focuses on the process Apple has set up to address interoperability requests submitted by developers and third parties for iOS and IPadOS [sic]. It is crucial that the request process is transparent, timely, and fair so that all developers have an effective and predictable path to interoperability and are enabled to innovate. [...]

The Commission will conclude the proceedings within 6 months from their opening.

As ever with the Commission and their bureaucratese, I’m unsure whether this announcement is perfunctory or an escalation. But I think it’s an escalation, and they’re so irritated by Apple’s refusal to cave to the “spirit” of the DMA while complying with the letter of the law, that they’re simply going to tell Apple exactly what they want them to do in six months. This is not going to go well, as it seems they’re going to demand Apple offer third-party peripheral makers and software developers the same access to system-level software that Apple’s own first-party peripherals and software have. I’m not even sure why they’re having proceedings, because this press release makes it sound like they’ve already decided.

Also worth noting: Margrethe Vestager is on her way out, about to be replaced by Spanish socialist Teresa Ribera, a career climate expert (which, possibly, might give her an affinity for Apple, far and away the most climate-friendly large tech company) with no experience in competition law. To me that makes Ribera an odd choice for the competition chief job, but apparently that makes sense in the EU. It remains unclear to me whether Ribera supports Vestager’s crusade against the DMA’s designated “gatekeepers”. If she doesn’t, is this all for naught?

Sidenote: Honest question: Can someone explain to me the Commission’s use of boldfacing? In the first 265 words of the press release, 66 of them are bold, across 13 different spans. They seemingly use boldfacing the way Trump capitalizes words in his tweets: indiscriminately. I find it highly distracting, like trying to read a ransom letter. It’s not just this press release; they do it all the time.

Apple Seeds First Public Betas of iOS 18.1 and macOS 15.1 Sequoia With Apple Intelligence 

Still a little awkward for a tentpole marketing feature of the iPhones 16 arriving tomorrow, but a public beta is a notable milestone.


The iPhones 16

One of the many memorable moments in Steve Jobs’s 2007 introduction of the original iPhone was this slide showing four of the then-leading smartphones on the market. Jobs explained:

Now, why do we need a revolutionary user interface? Here’s four smartphones, right? Motorola Q, the BlackBerry, Palm Treo, Nokia E62 — the usual suspects. And, what’s wrong with their user interfaces? Well, the problem with them is really sort of in the bottom 40 there. It’s this stuff right there. They all have these keyboards that are there whether or not you need them to be there. And they all have these control buttons that are fixed in plastic and are the same for every application. Well, every application wants a slightly different user interface, a slightly optimized set of buttons, just for it.

And what happens if you think of a great idea six months from now? You can’t run around and add a button to these things. They’re already shipped. So what do you do? It doesn’t work because the buttons and the controls can’t change. They can’t change for each application, and they can’t change down the road if you think of another great idea you want to add to this product.

Well, how do you solve this? Hmm. It turns out, we have solved it. We solved it in computers 20 years ago. We solved it with a bitmapped screen that could display anything we want. Put any user interface up. And a pointing device. We solved it with the mouse. We solved this problem. So how are we going to take this to a mobile device? What we’re going to do is get rid of all these buttons and just make a giant screen. A giant screen.

At the time, what seemed most radical was eschewing a hardware QWERTY keyboard and instead implementing a touchscreen keyboard in software. Steve Ballmer, then CEO of Microsoft, in the infamous clip in which he laughed uproariously after being asked for his reaction to seeing the iPhone: “500 dollars? Fully subsidized, with a plan? I said, that is the most expensive phone in the world, and it doesn’t appeal to business customers because it doesn’t have a keyboard, which makes it not a very good email machine.”

Apple didn’t get rid of all the buttons, of course. But the buttons they kept were all for the system, the device, not for any specific application: power, volume, a mute switch (that, oddly, was copied by almost no competitors), and the lone button on the front face: Home.1 That’s it.

When Apple’s competitors stopped laughing at the iPhone and started copying it, they got rid of their hardware keyboards — theretofore the primary signifier differentiating a “smartphone” from a regular phone — but they couldn’t bring themselves to eliminate the not one but two dedicated hardware buttons that, to their unimaginative minds, were inherent to making any cell phone a phone: the green “call” and red “hang up” buttons. Android phones had those red/green buttons. The BlackBerry Storm had them too. Every phone but the iPhone had them. Until they caught up and realized those buttons were obviated too.

The thinking might have been rooted in the very name of the devices. Of course all phones — dumb phones, BlackBerry-style hardware-keyboard phones, iPhone-style touchscreen phones — ought to have phone buttons. I suspect they pondered very deeply how Apple was bold enough to eschew a hardware keyboard for an all-touchscreen design, but that they thought Apple was just taking minimalism to its extreme by eschewing green/red hardware call buttons. No matter how many other things they do, they’re phones first — it’s right there in their name!

But the iPhone has never really been fundamentally a telephone. On the iPhone, the Phone was always just another app. A special app, no question. Default placement in the Dock at the bottom of the Home Screen. Special background privileges within an otherwise highly constrained OS where most apps effectively quit when you’d go back to the Home Screen. Incoming phone calls instantly took over the entire screen. Jobs spent a lot of time in that introduction demonstrating the Phone app — including Visual Voicemail, a genuine breakthrough feature that required AT&T/Cingular’s cooperation on the back end.2

But, still, the Phone part of iPhone was then and remains now just an app. If you compared an iPhone to an iPod Touch, there was nothing on the iPhone hardware that indicated it was any more of a phone than the iPod Touch, which not only wasn’t a phone but didn’t even offer cellular networking. No buttons, for sure. No stick-out antenna. No carrier logo on the device. Look at a modern iPhone and there’s really only one function whose purpose is clearly visible from a conspicuous hardware protrusion: the camera lenses. Five years ago, in the lede of my review of the iPhones 11, I wrote, “A few weeks ago on my podcast, speculating on the tentpole features for this year’s new iPhones, I said that ‘iCamera’ would be a far more apt name than ‘iPhone’.”

What more proof of the camera’s singular importance to the iPhone would one need than the ever-growing block of camera lenses on the back of each year’s new models, or the “Shot on iPhone” ad campaign — the longest-running (and still ongoing) campaign in Apple’s history? A dedicated hardware button?

Camera Control

The facile take is that Apple has run out of hardware ideas and now just adds a new button to the iPhone each year — Action button last year, Camera Control this year, maybe they’ll finally add those green/red phone call buttons next year. But that’s underestimating just how radical it is for Apple, in the iPhone’s 18th annual hardware iteration, to add a hardware button dedicated to a single application.

And I mean application there in the general sense, not just the app sense. By default, of course, pressing Camera Control launches the iOS Camera app,3 but while setting up any new iPhone 16, Apple’s own onboarding screen describes its purpose as launching “a camera app”, with a lowercase c. Any third-party app that adopts new APIs and guidelines can serve as the camera app that gets launched (and, once launched, controlled) by Camera Control. (I’ll default to writing about using the system Camera app, though.)
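
For the developer-minded, here’s a rough sketch of what adopting Camera Control might look like, using the capture-control additions Apple made to AVFoundation in iOS 18. The type and method names below (AVCaptureSystemZoomSlider, addControl, and so on) reflect my reading of that framework, so treat them as assumptions rather than gospel:

import AVFoundation

// Sketch only: opting a third-party capture session into the hardware
// Camera Control. Names and signatures are my understanding of the
// iOS 18 capture-control API, not verified against Apple's docs.
func enableCameraControl(on session: AVCaptureSession, for device: AVCaptureDevice) {
    // Hardware capture controls aren't available on every device.
    guard session.supportsControls else { return }

    session.beginConfiguration()

    // A system-provided control that maps the light-press slide gesture
    // on Camera Control to zoom on the given capture device.
    let zoomSlider = AVCaptureSystemZoomSlider(device: device)
    if session.canAddControl(zoomSlider) {
        session.addControl(zoomSlider)
    }

    session.commitConfiguration()

    // The app would also set an AVCaptureSessionControlsDelegate on the
    // session so it knows when the Camera Control overlay is active and
    // can hide its own on-screen controls accordingly.
}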

Apple seemingly doesn’t ever refer to Camera Control as a “button”, but it is a button. You can see it depress, and it clicks even when the device is powered off (unlike, say, the haptic Touch ID Home button on iPhones of yore and the long-in-the-tooth iPhone SE). But it isn’t only a button. You can think of it as two physical controls in one: a miniature haptic slider (like a trackpad with only one axis) and an actually-clicking button.

When the Camera app is not already in shoot mode (whether your iPhone is on the Lock Screen or if another app is active — or even if you’re doing something else inside the Camera app other than shooting, like, say, reviewing existing photos):

  • A full click of the Camera Control button launches the Camera app (if necessary) and puts you in shoot mode.
  • A light press triggers nothing, and offers no haptic feedback. Light pressing only does something when you’re in the Camera app ready to shoot.

When the Camera app is active and ready to shoot:

  • A full click of the Camera Control button takes a photo or starts a video, depending on your current mode. (If you’re in still-photo mode, clicking-and-holding Camera Control will start a video, just like pressing-and-holding the on-screen shutter button.)
  • A light press on Camera Control opens an overlay that allows you to adjust the current settings mode by sliding your finger left to right, trackpad-style.
  • A double light press on Camera Control changes the overlay to let you select which setting to adjust. The options are: Exposure, Depth (ƒ-stop), Zoom, Cameras, Style, Tone.

Just writing that all out makes it sound complicated, and it is a bit complex. (Here’s Apple’s own illustrated guide to using Camera Control.) Cameras are complex. But if you just mash it down, it takes a picture. Camera Control is like a microcosm of the Camera app itself. Just want to point and shoot? Easy. Want to fiddle with ƒ-stops and styles? There’s a thoughtful UI to do that. In the early years of iPhone, Apple’s Camera app was truly point-and-shoot simplistic. The shooting interface had just a few buttons: a shutter, a photo/video toggle, a control for the flash, and a toggle for switching to the front-facing camera. The original iPhone and iPhone 3G didn’t even support video, and the front-facing camera didn’t arrive until the iPhone 4. Those old iPhones had simple camera hardware, and the app reflected that simplicity.

Apple’s modern camera hardware has become remarkably sophisticated, and the Camera app has too. But if you just want to shoot what you see in the viewfinder, it’s as simple as ever. Pinch to zoom, tap to focus, press the shutter button to shoot. But so many other controls and options are there, readily available and intelligently presented for those who want them, easily ignored by those who don’t. Apple’s Camera app is one of the best — and best-designed — pieces of software the world has ever seen. It’s arguably the most-copied interface the world has ever seen, too. You’d be hard-pressed to find a single premium Android phone whose built-in camera app doesn’t look like Apple’s, usually right down to the yellow accent color for text labels.

After over a week using several iPhone 16 review units, my summary of Camera Control is that it takes a while to get used to — I feel like I’m still getting used to it — but it already feels like something I wouldn’t want to do without. It’s a great idea, and a bold one. As I emphasized above, only in the 18th hardware revision has Apple added a hardware control dedicated to a single application. I don’t expect Apple to do it again. I do expect Apple’s rivals to copy Camera Control shamelessly.

At first, though, I was frustrated by the physical placement of Camera Control. As a hobbyist photographer who has been shooting with dedicated cameras all the way back to the late 1990s, my right index finger expects a shutter button to be located near the top right corner. But the center of Camera Control is 2 inches (5 cm) from the corner. I’ll never stop wishing for it to be closer to the corner, but after a week I’ve grown acclimated to its actual placement. And I get it. I’m old enough that I shoot all of my videos and most of my photos in widescreen orientation. But social media today is dominated by tallscreen video. As Apple’s Piyush Pratik explained during last week’s keynote, Camera Control is designed to be used in both wide (landscape) and tall (portrait) orientations. Moving it more toward the corner, where my finger wants it to be, would make it better for shooting widescreen, but would make it downright precarious to hold the iPhone while shooting tall. I hate to admit it but I think Apple got the placement right. Shooting tallscreen is just way too popular. And, after just a week, my index finger is getting more and more accustomed to its placement. It might prove to be a bit of a reach for people with small hands, though.

I’ve also been a bit frustrated by using Camera Control to launch Camera while my iPhone is locked. With the default settings, when your iPhone is unlocked, or locked but with the screen awake, a single click of Camera Control takes you right to shooting mode in the Camera app. That sounds obvious, and it is. But, when your iPhone is locked and the screen is off, or in always-on mode, clicking Camera Control just wakes up the screen. You have to click it again, after the screen is awake, to jump to shooting mode. Apple’s thinking here is obvious: they want to prevent an accidental click of Camera Control while it’s in your pocket or purse from opening Camera. Unlike almost every other mode you can get into on an iPhone, when you’re in shooting mode in Camera, the device won’t go to sleep automatically after a minute or two of inactivity. The current default in iOS 18, in fact, is to auto-lock after just 30 seconds. (Settings → Display & Brightness → Auto-Lock.) In shooting mode, the Camera app will stay open for a long time before going to sleep. You don’t want that to happen inadvertently while your iPhone is in your pocket.

But what I’ve encountered over the last week are situations where my iPhone is in my pocket, and I see something fleeting I want to shoot. This happened repeatedly during a Weezer concert my wife and I attended last Friday. (Great show.) What I want is to click Camera Control while taking the iPhone out of my pocket, and have it ready to shoot by the time I have it in front of my eyes. That’s how the on/off button works on dedicated cameras like my Ricoh GR IIIx. But with an iPhone 16, more often than not, the single click of Camera Control while taking the iPhone out of my pocket has only awakened the screen, not put it into shooting mode. I need to click it again to get into shooting mode. With a fleeting moment, that’s enough to miss the shot you wanted to take. The whole point of this is being a quick-draw gunslinger.

Apple offers a more-protective option in Settings → Camera → Camera Control → Launch Camera to require a double click, rather than single click, to launch your specified camera app. As I write this, I wish that they also offered a less-protective option to always launch your camera app on a single click, even if the phone is locked and the screen is off. A sort of “I’ll take my chances with accidental clicks” option. It’s possible though that Apple tried this, and found that inadvertent clicks are just too common. But as it stands, there’s no great way to always jump into shooting mode as quickly as possible.

When the iPhone is locked and the screen is off, a double click of Camera Control will jump you into shooting mode. I started doing that over the weekend, and at first I thought it satisfied my desire. But the problem with that is that if the iPhone is locked but the screen is already awake, a double click on Camera Control will jump into Camera on the first click and snap a photo with the second. I’ve had to delete at least half a dozen blurry accidental shots because of that.

A gesture that would avoid accidental invocations is clicking-and-holding the Camera Control button. In theory Apple could offer that as a surefire way to launch Camera while taking your iPhone out of your pocket. But Apple has reserved the click-and-hold gesture for visual intelligence, a new Apple Intelligence feature announced last week. That’s the feature that will put the nail in the coffin of dedicated “AI” devices like Humane’s AI Pin and Rabbit’s R1. Visual intelligence isn’t yet available, even in the developer betas of iOS 18.1, but the click-and-hold gesture is already reserved for it.4

So where I’ve landed, at this writing, is trying to remember only to double-click Camera Control while taking my iPhone out of my pocket to shoot, and just sucking it up with the occasional blurry unwanted shot when I double-click Camera Control when the screen is already awake. The only other technique I can think to try is to remember to always wait until I see that the screen is awake before clicking Camera Control, tilting the phone if necessary to wake it, but that would seemingly defeat the purpose of getting into shooting mode as quickly as possible.

By default, if you light-press-and-hold on Camera Control, nearly all of the UI elements disappear from the viewfinder screen. The shooting mode picker (Cinematic, Video, Photo, Portrait, Spatial, etc.), the zoom buttons (0.5×, 1×, 2×, 5×), the front/rear camera toggle, the thumbnail of your most recent photo — all of that disappears from the screen, leaving it about as uncluttered as the original iPhone Camera interface. Think of it as a half-press while using Camera Control as a shutter button. Dedicated hardware cameras have, for many decades, offered two-stage shutter buttons that work similarly. With those dedicated cameras, you press halfway to lock in a focus distance and exposure; then you can move the camera to recompose the frame while keeping the focus distance and exposure locked, before pressing fully to capture the image. Apple has promised to bring this feature to the Camera app for all iPhone 16 models in a software update “later this year”. (It’s not there yet in iOS 18.1 beta 4.) Camera Control does not have quite the same precise feel as a true two-stage shutter button that physically clicks at two separate points of depression (two detents), but it might eventually, in future iPhone models.

One issue with Camera Control is that because it’s capacitive, it’s tricky for case makers. The obvious solution is to just put a cutout around it, letting the user’s finger touch the actual Camera Control button. Apple’s more elegant solution, on their own silicone and clear cases and the new glossy polycarbonate cases from their subsidiary Beats, is “a sapphire crystal, coupled to a conductive layer to communicate the finger movements to the Camera Control”. That doesn’t sound like something you’re going to see in cheap $20 cases. In my testing, both with Apple’s cases and Beats’s, it works fairly seamlessly. I do think you lose some of the feel from the haptic feedback on light presses, though. Ultimately, Camera Control makes it more true than ever before that the best way to use an iPhone is without a case.

One more thing on Camera Control. Of the features that are adjustable via Camera Control (again: Exposure, Depth (ƒ-stop), Zoom, Cameras, Style, Tone), “Cameras” is an easily overlooked standout. Zoom offers continuous decimal increments from 0.5× to 25.0×. That is to say, you can slide your finger to get zoom values like 1.7×, 3.3×, 17.4×, etc. I almost never want that. I want to stick to the precise true optical increments: 0.5×, 1×, 2×, and 5×. That’s what the “Cameras” setting mode offers. Think of it as Zoom, but only with those precise values. (Instead of “Cameras”, this setting could have been called “Lenses”, but that’s potentially confusing because 1× and 2× both come from the same physical lens; the difference is how the sensor data is treated.) In fact, I wish I could go into Settings and disable Zoom from the list of features available in Camera Control. If I ever really want a non-optical zoom level, I can use the existing on-screen interface options.

What’s obvious is that Camera Control clearly was conceived of, designed, and engineered by photography aficionados within Apple who are intimately familiar with how great dedicated cameras work and feel. It surely must have been championed, politically, by the same group. It’s really just rather astounding that there is now a hardware control dedicated to photography on all new iPhones — and a mechanically complex control at that.

Photography, Aside From Camera Control

As usual, I’ll leave it to other reviewers to do in-depth pixel-peeping comparisons of image quality, but suffice it to say, to my eyes, the iPhone 16 Pro (the review unit I’ve been daily driving this past week) camera seems as great as usual.

The big new photographic feature this year has nothing to do with lenses or sensors. It’s a next-generation Photographic Styles, and it’s effectively “RAW for the rest of us”. This has always been the edge of my personal photographic nerdery/enthusiasm. I care enough about photography to have purchased numerous thousand-ish dollar cameras (and lenses) over the decades, but shooting RAW has never stuck for me. I understand what it is, and why it is technically superior to shooting JPEG/HEIC, but it’s just too much work. RAW lets you achieve better results through manual development in post, but you have to develop in post because raw RAW images (sorry) look strikingly flat and unsaturated. For a while I tried shooting RAW + JPEG, where each image you take is stored both as a straight-off-the-sensor RAW file and a goes-through-the-camera-imaging-pipeline JPEG file, but it turned out I never ever went back and developed those RAW images. And relative to JPEG/HEIC (which, henceforth, I’m just going to call “shooting JPEG” for brevity, even though iPhones have defaulted to the more-efficient HEIC format since iOS 11 seven years ago), RAW images take up 10× (or more) storage space.

It’s just too much hassle. The increase in image quality I can eke out developing RAW just isn’t worth the effort it takes — for me. For many serious photographers, it is. Everyone has a line like that. Some people don’t do any editing at all. They never crop, never change exposure in post, never apply filters — they just point and shoot and they’re done. For me, that line is shooting RAW.

Apple first introduced Photographic Styles with the iPhones 13 three years ago, with four self-descriptive primary styles: Rich Contrast (my choice), Vibrant, Warm, and Cool. Each primary style offered customization. Find a style you like, set it as your default, and go about your merry way. But whatever style you chose was how your photos were “developed” by the iPhone hardware imaging pipeline. Apple’s “filters” have been non-destructive for years, but first-generation Photographic Styles are baked into the HEIC files the iPhone writes to storage.

With the iPhone 16 lineup, this feature is now significantly more powerful, while remaining just as convenient and easy to use.5 Apple eliminated what used to be called “filters” and recreated the better ones (e.g. Vibrant and Dramatic) as styles. There are now 15 base styles to choose from, most of them self-descriptively named (Neutral, Gold, Rose Gold), some more poetically named (Cozy, Quiet, Ethereal). The default style is named Standard, and it processes images in a way that looks, well, iPhone-y. The two that have me enamored thus far are Natural and Stark B&W. Standard iPhone image processing has long looked, to many of our eyes, at least slightly over-processed. Too much noise reduction, too much smoothing. A little too punchy. Natural really does look more natural, in a good way, to my eyes. Stark B&W brings to mind classic high-contrast black-and-white films like Kodak Tri-X 400.

A key aspect of Photographic Styles now is that they’re non-destructive. You can change your mind about any of it in post. Set your default to Stark B&W and later on, editing in Photos, you can change your mind and go back to a full-color image using whichever other style you want. There’s a lot of complex image processing going on behind the scenes — both in the iPhone 16 hardware and iOS 18 software — to make this seem like no big deal at all. But because the new Photographic Styles are largely (or entirely?) based on the hardware imaging pipeline, iPhones 13–15 will continue to use the first-generation Photographic Styles, even after upgrading to iOS 18.

I’ve always felt a little guilty about the fact that I’m too lazy to shoot RAW. This next-generation Photographic Styles feature in the iPhone 16 lineup might assuage, I suspect, the remaining vestiges of that guilt.

Design — and the Pro vs. Regular Question

Apple kindly supplied me with all four models in the iPhone 16 lineup for review: the 16 in ultramarine, 16 Plus in pink, 16 Pro in natural titanium, and 16 Pro Max in desert titanium. Ultramarine is my favorite color on any iPhone in memory. It’s a fun poppy blue, and quite vibrant. Pink is good too, and to my (and my wife’s) eyes it has a touch of purple to it. The colors are extra saturated on the camera bumps, which looks great. Natural titanium looks extremely similar, if not identical, to the natural titanium on last year’s iPhone 15 Pro. (Apple’s own Compare page makes it appear as though this year’s natural titanium is noticeably lighter than last year’s, but here’s a photo from me showing a natural 15 Pro Max and 16 Pro side-by-side.) Desert titanium is sort of more gold than tan, but there is some brown to it, without rendering it the least bit Zune-like.

In short, the regular iPhone 16 offers some colors that truly pop. The iPhone 16 Pro models remain, as with all previous “Pro” iPhone colorways, staid shades of gray. White-ish gray, gray gray, near-black gray, and now desert gray.

I always buy black, or the closest to black Apple offers, and this tweet I wrote back in 2009 remains true, so the only year I’ve ever had a “which color to buy?” personal dilemma was 2016 with the iPhones 7, which Apple offered in both a matte “black” and Vader-like glossy “jet black”.6 I still kind of can’t believe Apple offered two utterly different blacks in the same model year.

But “which model to buy?” is sometimes more of a dilemma for yours truly. In 2020 I bought a regular iPhone 12, not the 12 Pro, on the grounds that it weighed less and felt better in hand than the Pro models. Whatever the non-pro iPhone 12 lacked in photographic capabilities wouldn’t matter so much, I correctly guessed, while I remained mostly homebound during the COVID pandemic. But I was also tempted, sorely, by the 12 Mini, and in hindsight I really don’t remember why that’s not the model I bought that year.

It’s a good thing, and a sign of strength for Apple, when the regular iPhone models are extremely appealing even to power users. It seemed like an artificial restriction last year, for example, that only the 15 Pro model got the new Action button. The year prior, only the 14 Pro models got the Dynamic Island; the regular iPhone 14 models were stuck with a no-fun notch. If you’re fairly deep into the weeds regarding TSMC’s first-generation 3nm fabrication, it makes sense why only the iPhone 15 Pro models got a new chip (the A17 Pro — there was no regular A17) while the iPhone 15 models stayed on the year-old A16, but still, that was a bummer too. This year, the regular 16 and 16 Plus not only get the Action button, they get the new Camera Control too (which, as I opined above, would make more sense as a “pro” feature than the Action button did last year), and a new A18 chip fabricated with TSMC’s second-generation 3nm process.

For my own use I’ve preordered an iPhone 16 Pro. But for the first time since the aforementioned iPhone 12 in 2020, I was genuinely tempted by the regular iPhone 16. The biggest functional difference between the 16 and 16 Pro models is that only the 16 Pros have a third telephoto lens. Last year, the 15 Pro Max went to 5×, but the 15 Pro remained at 3×. This year, both the 16 Pro and 16 Pro Max have the 5× telephoto lens. I tend to think I seldom use the telephoto lens, but it turns out I used it a little more in the last year than I would have guessed. Using smart albums in Photos to sort images by camera and lens, it looks like out of 3,890 total photos I shot with my iPhone 15 Pro, the breakdown by camera lens went like this:

Camera      Optical Zoom   Photos   Percentage
Ultrawide   0.5×              338           9%
Main        1×/2×           3,076          79%
Telephoto   3×                476          12%
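
For what it’s worth, those percentages are just each lens’s share of the 3,890 total, rounded to whole numbers (and the three counts do sum to 3,890):

\[
\tfrac{338}{3{,}890} \approx 8.7\%, \qquad
\tfrac{3{,}076}{3{,}890} \approx 79.1\%, \qquad
\tfrac{476}{3{,}890} \approx 12.2\%
\]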

And, eyeballing the photos in that telephoto lens smart album, for most of them, I could have used a little more reach. I don’t expect to use 5× more often than I used 3×, but I expect to get better shots when I do. But it’s also the case that a fair number of the photos in that telephoto smart album are shots I just don’t care about that much. I do use the telephoto lens, and I look forward to having a 5× one instead of 3×, but I could live without it entirely and not miss it too much. (I only have 8 videos shot using 3× from the last year. Longer focal lengths are not good for handheld video.)

Aesthetically, the two-lens arrangement on the back of the iPhones 16 and 16 Plus is far more pleasing than the three-lens triangle-in-a-square arrangement on the iPhones 16 Pro and 16 Pro Max.

Back view of an iPhone 16 and iPhone 16 Pro.

For the last few years (the iPhone 13, 14, and 15 generations), the aesthetic difference in the back camera systems hasn’t been so striking, because Apple placed the non-pro iPhones’ two lenses in a diagonal arrangement inside a square block. The two lenses on the backs of the iPhones 11 and 12 were aligned on the same axis (vertical, when holding the phone in tallscreen orientation), but they were still inside a raised square. You’d have to go back to 2018’s iPhone XS to find a two-lens iPhone with the iPhone 16’s pleasing pill-shaped bump.

Back view of an iPhone 15, iPhone 12, and iPhone XS.

Either you care about such purely aesthetic concerns or you don’t. I care. Not enough to purchase an iPhone 16 instead of a 16 Pro, but it was a factor. The iPhone 16 and 16 Plus simply look more pleasing from the back and feel better in hand, especially caseless, than any iPhone since 2018.

Here’s the pricing for the entire iPhone 16 lineup:

Model        128 GB   256 GB   512 GB     1 TB
16             $800     $900   $1,100        —
16 Plus        $900   $1,000   $1,200        —
16 Pro       $1,000   $1,100   $1,300   $1,500
16 Pro Max        —   $1,200   $1,400   $1,600

But perhaps a better way to compare is by size class. Regular size:

Model        128 GB   256 GB   512 GB     1 TB
16             $800     $900   $1,100        —
16 Pro       $1,000   $1,100   $1,300   $1,500

And big-ass size:

Model        128 GB   256 GB   512 GB     1 TB
16 Plus        $900   $1,000   $1,200        —
16 Pro Max        —   $1,200   $1,400   $1,600

At both size classes, it’s a $200 delta to go from the regular model to its Pro equivalent. Looking at Apple’s excellent-as-always Compare page, here are the advantages/exclusive features that jump out to me for the 16 Pro models, other than the extra telephoto camera lens, roughly in the order in which I personally care:

  • Always-on display.
  • ProMotion display (adaptive refresh rates up to 120 Hz vs. 60 Hz).
  • An extra GPU core (6 vs. 5), which Geekbench 6 benchmarks as 17 percent faster. Call it 20 percent if you trust core count more than Geekbench.
  • Night mode portrait photos.
  • LiDAR scanner, which I presume is a (or the?) reason why Night mode portrait photos are Pro-exclusive.
  • “Studio-quality four-mic array”. I put that in quotes not to express skepticism but because I haven’t tested it or compared it against the iPhone 16. But it, uh, sounds like a great new feature.
  • USB 3 support vs. USB 2, for “up to 20× faster transfers”.
  • A roughly 4 percent faster CPU in both single- and multi-core performance, according to Geekbench 6.
  • Ability to shoot Dolby Vision video up to 4K at 120 fps.
  • Apple ProRAW photos and ProRes videos (and other pro video features like log video recording and ACES).

I think it’s amazing that the iPhone Pro models are now able to shoot professional-caliber video. But I don’t shoot video professionally. And because I don’t, I can’t remember the last time I needed to transfer data from my iPhone via the USB-C port, so, while the Pro models offer a noticeable advantage in USB performance, I might never use it personally over the next year.

Another difference is that the 16 Pro models have slightly bigger displays than the regular 16 models. The 16 Pro and 16 Pro Max are 6.3 and 6.9 inches; the regular 16 and 16 Plus are 6.1 and 6.7. Whether that’s actually an advantage for the Pro models depends on whether you care that they’re also slightly taller and heavier devices in hand.

Battery Life

I omitted from the above comparison the one spec people care most about: battery life. Here is the sleeper spec where the Pro models earn their keep. Once again grouping like-vs.-like size classes, and including the 15 Pro models for year-over-year comparison:

Model        Video      Video (streamed)
15 Pro       23 hours   20 hours
16           22 hours   18 hours
16 Pro       27 hours   22 hours

15 Pro Max   29 hours   25 hours
16 Plus      27 hours   24 hours
16 Pro Max   33 hours   29 hours

Those battery life numbers come from Apple, not my own testing (and Apple cites them as “up to” numbers). But those numbers suggest 20 percent longer battery life on the 16 Pro models compared to their size-class non-pro counterparts. Anecdotally, that feels true to me. I use a Shortcuts automation to turn on Low Power mode whenever my iPhone battery level drops below 35 percent. With my iPhone 15 Pro, that generally happens every night at some point. Over the last week using the iPhone 16 Pro as my primary iPhone, it hasn’t dropped that low most nights. To say the least, that’s not a rigorous test in any way, shape, or form. But Apple has no history of exaggerating battery life claims, especially relative comparisons between devices. I think it’s the real deal, and the 16 Pro and 16 Pro Max probably get 20 percent longer battery life than their corresponding 16 and 16 Plus counterparts, and between 10–15 percent over last year’s Pro models, in practical day-to-day use.
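
A rough back-of-the-envelope using the video-playback hours in the table above bears this out: the Pro-vs.-non-pro gap is a bit over 20 percent, and the year-over-year Pro gap lands in the mid teens (the streamed numbers work out similarly):

\[
\frac{27}{22} \approx 1.23, \quad \frac{33}{27} \approx 1.22
\qquad \text{(16 Pro vs. 16; 16 Pro Max vs. 16 Plus)}
\]
\[
\frac{27}{23} \approx 1.17, \quad \frac{33}{29} \approx 1.14
\qquad \text{(16 Pro vs. 15 Pro; 16 Pro Max vs. 15 Pro Max)}
\]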

That alone might be worth a big chunk of the $200 price difference to some people.

Apple Intelligence

I spent the weekdays last week running iOS 18.0; on Friday afternoon, I upgraded my 16 Pro review unit to the developer beta of iOS 18.1 (beta 3 at the time, since upgraded to beta 4). I’m sure many, if not most, reviewers will review only what comes in the box, and what’s coming in the box this week will be iOS 18.0 without any Apple Intelligence features.

That stance is fair enough, but I don’t see it as a big deal to include my 18.1 experience in this review. iOS 18.1 feels pretty close to shipping. Apple has promised “October”, and my gut feeling, using it for the last five days on this review unit, is that it’s pretty solid. I suspect it might ship closer to early October than late October. But even if it doesn’t appear until Halloween, I don’t think it’s absurd or offensive that Apple is already using Apple Intelligence to market the iPhone 16 lineup. It’s a little awkward right now, but it’s not a sham. It’s vaporware until it actually ships, but it’s vaporware that anyone with a developer account can install right now.

Also, none of the Apple Intelligence features currently in iOS 18.1 are game-changing. The Clean Up feature in Photos is pretty good, and when it doesn’t produce good results, you can simply revert to the original. The AI-generated summaries of messages, notifications, and emails in Mail are at times apt, but at others not so much. I haven’t tried the Rewrite tool because I’m, let’s face it, pretty confident in my own writing ability. But, after my own final editing pass, I ran this entire review through the Proofread feature, and it correctly flagged seven mistakes I missed, and an eighth that I had marked, but had forgotten to fix. Most of its suggestions that I have chosen to ignore were, by the book, legitimate. (E.g., it suggested replacing the jargon-y lede with the standard spelling lead. It also flagged my stubborn capitalization of “MacOS”.) It took 1 minute, 45 seconds to complete the proofreading pass of the 7,200+ words in Apple Notes on the iPhone 16 Pro. (Subsequent to the original publication of this review, I tried the Rewrite function on the text of it, for shits and giggles, and the only way I can describe the results is that it gave up.)

New Siri definitely offers a cooler-looking visual interface. And the new Siri voices sound more natural. But it also feels like Siri is speaking too slowly, as though Siri hails from the Midwest or something. (Changing Siri’s speaking rate to 110 percent in Settings → Accessibility → Siri sounds much more natural to my ears, and feels like it matches old Siri’s speaking rate.) Type to Siri is definitely cool, but I don’t see why we couldn’t have had that feature since 2010. I have actually used the new “Product Knowledge” feature, where Siri draws upon knowledge from Apple’s own support documentation, while writing this review. It’s great. But maybe Apple’s support website should have had better search years ago?

These are all good features. But let’s say you had never heard of LLMs or ChatGPT, and instead, at WWDC this year, without any overarching “Apple Intelligence” marketing umbrella, Apple had simply announced features like a new cool-looking Siri interface, typing rather than talking to Siri, the ability to remove unwanted background objects from photos, and a “proofreading” feature for the standard text system that extends and improves the years-old but (IMO) kinda lame grammar-checking feature on MacOS and brings it to iOS too. Those would seem like totally normal features Apple might add this year. But not tentpole features. These Apple Intelligence features strike me as nothing more than the sort of nice little improvements Apple makes across its OSes every year.

Apple reiterated throughout last week’s “It’s Glowtime” keynote, and now in its advertising for the iPhone 16 lineup, that these are the first iPhones “built for Apple Intelligence from the ground up”. I’m not buying that. These are simply the second generation of iPhone models with enough RAM to run on-device LLMs. LLMs are breakthrough technology. But they’re breakthroughs at the implementation level. The technology is fascinating and important, but so are things like the Swift programming language. I spent the first half of my time testing the iPhone 16 Pro running iOS 18.0 and the second half running 18.1 with Apple Intelligence. A few things got a little nicer. That’s it.

I might be underselling how impossible the Clean Up feature would be without generative AI. I am very likely underselling how valuable the new writing tools might prove to people trying to write in a second language, or who simply aren’t capable of expressing themselves well in their first language. But like I said, they’re all good features. I just don’t see them as combining to form the collective tentpole that Apple is marketing “Apple Intelligence” as. I get it that from Apple’s perspective, engineering-wise, it’s like adding an entire platform to the existing OS. It’s a massive engineering effort and the on-device execution constraints are onerous. But from a user’s perspective, they’re just ... features. When’s the last year Apple has not added cool new features along the lines of these?

Apple’s just riding — and now, through the impressive might of its own advertising and marketing, contributing to — the AI hype wave, and I find that a little eye-roll inducing. It would have been cooler, in an understated breathe-on-your-fingernails-and-polish-them-on-your-shirt kind of way, if Apple had simply added these same new features across their OSes without the marketing emphasis being on the “Apple Intelligence” umbrella. If not for the AI hype wave the industry is currently caught in, this emphasis on which features are part of “Apple Intelligence” would seem as strange as Apple emphasizing, in advertisements, which apps are now built using SwiftUI.

If the iPhone 16 lineup was “built from the ground up” with a purpose in mind, it’s to serve as the best prosumer cameras ever made. Not to create cartoon images of a dog blowing out candles on a birthday cake. The new lineup of iPhones 16 are amazing devices. The non-pro iPhone 16 and 16 Plus arguably offer the best value-per-dollar of any iPhones Apple has ever made. This emphasis on Apple Intelligence distracts from that.

The problem isn’t that Apple is marketing Apple Intelligence a few weeks before it’s actually going to ship. It’s that few of these features are among the coolest or most interesting things about the new iPhone 16 lineup, and none are unique advantages that only Apple has the ability or inclination to offer.7 Every phone on the market will soon be able to generate impersonal saccharine passages of text and uncanny-valley images via LLMs. Only Apple has the talent and passion to create something as innovative and genuinely useful as Camera Control. 


  1. While I’m reminiscing, allow me to reiterate my belief that the icon on the iPhone Home button is the single greatest icon ever designed. In my 2017 review of the iPhone X, I wrote:

    The fundamental premise of iOS Classic is that a running app gets the entire display, and the Home button is how you interact with the system to get out of the current app and into another. Before Touch ID, the Home button was even labeled with a generic empty “app” icon, an iconographic touch of brilliance. [...]

    I find it hard to consider a world where that button was marked by an icon that looked like a house (the overwhelmingly common choice for a “home” icon) or printed with the word “HOME” (the way iPods had a “MENU” button). Early iPhone prototypes did, in fact, have a “MENU” label on the button.

    I truly consider the iPhone Home button icon the single best icon ever. It perfectly represented anything and everything apps could be — it was iconic in every sense of the word.

     ↩︎

  2. It’s almost unfathomable how much of a pain in the ass voicemail was before the iPhone. Rather than manage messages on screen, you placed a phone call to your carrier and interfaced with their system by punching number buttons. You had to deal with each message sequentially. “Press 1 to play, 2 to go to the next message, 7 to delete.” And you had to actually listen to the messages to know who they were from. It was horrible. ↩︎︎

  3. Unless, I suppose, you live in the EU and have exercised your hard-earned right to delete it. ↩︎︎

  4. That’s the only way to launch visual intelligence, which means the feature is exclusive to the iPhone 16 lineup and won’t be available on iPhone 15 Pros. I’m truly looking forward to this feature, so that’s a bummer for iPhone 15 Pro owners. ↩︎︎

  5. Here’s Apple’s brief documentation for the old Photographic Styles feature (iPhones 13, 14, 15) and the new version (iPhones 16). ↩︎︎

  6. Jet black aluminum is back, as Vader-esque as it was on the iPhone 7 in 2016, in a new colorway for the Apple Watch Series 10 this year. I have a review unit in jet black on my wrist and it’s so great. ↩︎︎

  7. It’s fair to argue that Private Cloud Compute is uniquely Apple. Not that Apple is the only company that could build out such an infrastructure for guaranteed-private off-device AI processing, but among the few companies that could do it, Apple is the only one that cares so deeply about privacy that they would. I do not expect Private Cloud Compute to be replicated by Google, Samsung, Meta, Amazon, or Microsoft. Nor any of the AI startups like OpenAI or Anthropic. They simply don’t care enough to do it the hard way. Apple does. But that belongs in the marketing for Apple’s ongoing Privacy campaign, not for the iPhones 16 in particular. ↩︎︎


Apple Intelligence Will Come to More Languages, Including German and Italian, Next Year (But Don’t Hold Your Breath for iPhones and iPads) 

Allison Johnson, The Verge:

Apple Intelligence’s list of forthcoming supported languages just got a little longer. After an October launch in US English, Apple says its AI feature set will be available in German, Italian, Korean, Portuguese, Vietnamese, “and others” in the coming year. The company drops this news just days before the iPhone 16’s arrival — the phone built for AI that won’t have any AI features at launch.

Apple’s AI feature set will expand to include localized English in the UK, Canada, Australia, South Africa, and New Zealand in December, with India and Singapore joining the mix next year. The company already announced plans to support Chinese, French, Japanese, and Spanish next year as well.

Apple shared this news with me last night too, and my first thought was, “German and Italian? Does that mean they’ve gotten the OK that Apple Intelligence is, in fact, compliant with the DMA?” But that’s not what they’re announcing. This is just for Apple Intelligence on the Mac — which already offers Apple Intelligence in the EU in MacOS 15.1 Sequoia betas, because the Mac is not a designated “gatekeeping” platform. The standoff over Apple Intelligence on iOS and iPadOS remains.

Voyager 1 Just Fired Up Thrusters It Hasn’t Used in Decades 

Ashley Strickland, reporting for CNN:

Engineers at NASA have successfully fired up a set of thrusters Voyager 1 hasn’t used in decades to solve an issue that could keep the 47-year-old spacecraft from communicating with Earth from billions of miles away. [...]

As a result of its exceptionally long-lived mission, Voyager 1 experiences issues as its parts age in the frigid outer reaches beyond our solar system. When an issue crops up, engineers at NASA’s Jet Propulsion Laboratory in Pasadena, California, have to get creative while still being careful of how the spacecraft will react to any changes.

Currently the farthest spacecraft from Earth, Voyager 1 is about 15 billion miles (24 billion kilometers) away. The probe operates beyond the heliosphere — the sun’s bubble of magnetic fields and particles that extends well beyond Pluto’s orbit — where its instruments directly sample interstellar space.

Michael Chabon, on Threads:

I find the continuing mission of Voyager 1 so moving, for the way its name alone evokes a time of promise, for the thought of that tiny contraption way out there in the vastness at the edge of the heliosphere — perhaps the farthest any human-made thing may ever travel — a bit battered, swiftly aging, still doing the work it was purposed to do.

An amazing feat of engineering five decades ago, kept going by amazing feats of engineering today.

Israel Planted Explosives in Pagers Sold to Hezbollah, Officials Say 

Sheera Frenkel and Ronen Bergman, reporting for The New York Times:

Israel carried out its operation against Hezbollah on Tuesday by hiding explosive material within a new batch of Taiwanese-made pagers imported into Lebanon, according to American and other officials briefed on the operation.

The pagers, which Hezbollah had ordered from Gold Apollo in Taiwan, had been tampered with before they reached Lebanon, according to some of the officials. Most were the company’s AP924 model, though three other Gold Apollo models were also included in the shipment.

The explosive material, as little as one to two ounces, was implanted next to the battery in each pager, two of the officials said. A switch was also embedded that could be triggered remotely to detonate the explosives.

At 3:30 p.m. in Lebanon, the pagers received a message that appeared as though it was coming from Hezbollah’s leadership, two of the officials said. Instead, the message activated the explosives. Lebanon’s health minister told state media at least nine people were killed and more than 2,800 injured.

The devices were programmed to beep for several seconds before exploding, according to three of the officials.

Hezbollah leadership had ordered its members to forgo modern phones for security reasons, convinced (probably correctly) that Israeli intelligence was able to track them. So they switched to decades-old pagers. But Israel seemingly infiltrated the supply chain of Gold Apollo and boobytrapped the pagers.

In the initial pandemonium after the attack was triggered, there was speculation that, somehow, it was simply the batteries that exploded. But batteries — especially the AAA batteries these pagers use — don’t explode with that much force:

Independent cybersecurity experts who have studied footage of the attacks said it was clear that the strength and speed of the explosions were caused by a type of explosive material.

“These pagers were likely modified in some way to cause these types of explosions — the size and strength of the explosion indicates it was not just the battery,” said Mikko Hypponen, a research specialist at the software company WithSecure and a cybercrime adviser to Europol.

This whole operation sounds like it would make for a great movie.

(Hypponen, whom I believe I met, at least once, at a long-ago Macworld Expo or WWDC, was previously referenced on DF in 2012 regarding a widespread Mac Trojan horse.)

Ten Years of Six Colors 

Jason Snell:

Ten years sure seems like a long time.

Ten years ago the iPhone got physically big for the first time. (In the ensuing decade, iPhone revenue has doubled.) Ten years ago Apple announced the Apple Watch.

Ten years ago I found myself without a job for the first time.

I ran into Snell before (and again after) Apple’s event last week, and he mentioned that it marked Six Colors’s 10th anniversary. My reaction: I somehow simultaneously think of Six Colors as still kinda new and a bedrock, irreplaceable part of the Apple media firmament.

On days like today, it’s the first site I visit, because of pieces like these:

And, nearest and dearest to my heart, Snell’s review of MacOS 15 Sequoia. All of that, just today.

Apple Watch’s Sleep Apnea Detection Feature Now Available in More Than 150 Countries 

Joe Rossignol, reporting for MacRumors:

Apple released watchOS 11 today following months of beta testing. A key new health-related feature included in the software update is sleep apnea detection, which is available starting today on the Apple Watch Series 10, Apple Watch Series 9, and Apple Watch Ultra 2 in more than 150 countries and regions, according to Apple.

The list of countries includes the U.S., U.K., France, Germany, Italy, Spain, Japan, New Zealand, Singapore, and many others, with a full list available on Apple’s website. A few English-speaking countries where the feature is not yet available are Australia and Canada, as Apple is still seeking regulatory clearance for the feature in some regions.

Apple has also published the clinical validation summary for the sleep apnea notification feature.


The Things They Carried

Thoughts and Observations in the Wake of Apple’s ‘It’s Glowtime’ Keynote

Back in July 2007, I contributed this photo to a Flickr group called, self-describingly, “The Items We Carry”:

A top-down view of the following items, arranged neatly on a wooden tabletop: a small Leatherman pocketknife/multi-tool, a keyring with four keys, a small leather wallet, a Ricoh GR-D digital camera, cheap black aviator sunglasses, a 0.5 mm Pilot G2 pen, a black pocket-sized hardcover Moleskine notebook, a black Swiss Army wristwatch, and an original iPhone.

17 years later, I’ve consolidated. 2007 was so long ago that Field Notes hadn’t yet been created; my back-pocket notebooks are much slimmer now than that hardcover Moleskine.1 Instead of the Leatherman and a ring of four keys, I’m down to just two keys and a Halifax key-sized tool from The James Brand. The keys and Halifax lie flat and fit comfortably in the change pocket of a pair of jeans.2 And while I do own (and very much enjoy) a Ricoh GR IIIx — a modern but very similarly sized descendant of the above camera — I don’t carry it on an everyday basis because iPhone cameras have gotten so good.

But there was an everyday carry item I omitted from the above photo — wired earbuds. I don’t remember if I omitted them because I forgot to include them, or if it was for aesthetic reasons. But circa 2007 I generally had a pair of wired Apple earbuds with me. Today, I’ve always got my AirPods Pro — far and away my favorite consolidation. By weight, AirPods (with case) are heavier, but measured by fussiness, they offer a never-tangled fraction of the wired earbuds experience.

The remaining items in my pockets are equivalent but upgraded. I’ve got better (and because I no longer wear contact lenses, prescription) sunglasses. I long ago switched from the Pilot G2 to the Zebra Sarasa, a pen I consider nearly perfect. The above Swiss Army watch broke in early 2009, and I’ve since started a small (I swear) collection of mechanical watches. But on most workdays, there’s an Apple Watch on my wrist.

iPhone, AirPods, Apple Watch. I’ve almost always got two of them with me, and often all three. I’m sure for many of you reading this, it’s always all three.

These are Apple’s three “everyday carry” products. They’re as much a part of our lives as our clothes and glasses and jewelry. They thus go together swimmingly in the same product introduction event.

Last week’s “It’s Glowtime” event was very strong for Apple. It might have been the single strongest iPhone event since the introduction of the iPhone X. All three platforms are now in excellent, appealing, and coherent shape:

  • iPhone: Last year was a bit “meh” for the non-pro iPhone 15 models, which were stuck with year-old A16 silicon and didn’t get the new hardware Action button. This year, the iPhone 16 and 16 Plus get the brand-new A18 chip, the Action button, and the new Camera Control button. That Camera Control button would have made total sense as a Pro-exclusive feature this year (more sense, to me, than making the Action button Pro-exclusive last year) but all new iPhones have it. This year’s iPhone 16 and 16 Plus even come in the best and most fun colors in a few years. The iPhone 16 Pro and 16 Pro Max have best-in-class camera systems, the A18 Pro chip (6 GPUs vs. 5 in the regular A18, bigger CPU caches, faster memory and storage I/O), and slightly bigger screens.3 The iPhone remains the best product in the most important and profitable device category the world has ever seen.

  • Apple Watch: The new Series 10 models sport bigger displays, longer battery life, and 10 percent reductions in thickness and weight. The new watch displays are slightly bigger, and, more importantly, also have noticeably wider viewing angles, and in always-on mode update once per second instead of once per minute. There’s an absolutely gorgeous polished jet black option in aluminum — marking, to my taste, the first time in the entire history of Apple Watch that there’s a base-priced aluminum model that stands toe-to-toe with the more expensive models on aesthetic grounds. But those more expensive models are better than ever too, with Apple replacing polished stainless steel with much lighter polished titanium. I was unaware that titanium could be polished to a mirror-like sheen. Apple had previously made Series models of Apple Watch in titanium — I bought one in Series 5 and again in Series 7 — but those were at the Edition tier, priced above the stainless steel models. There was no Ultra 3 announcement — the lone sour note in the Watch segment — but the Ultra 2 is now available in an excellent and sure-to-be-popular satin black DLC coating.

  • AirPods: No more selling years-old models at entry-level prices. The new good/better/best lineup is clear. Good: new AirPods 4 for $129. Better: AirPods 4 with Active Noise Cancellation (ANC) and a better case (with wireless charging and Find My support, including a speaker) for $179. Best: the existing $249 AirPods Pro, which, yes, debuted two years ago, but which continue to be updated with improved functionality via firmware updates — most notably and importantly now, certified support for use as hearing aids. There’s also the revised AirPods Max, with new colors and USB-C replacing Lightning, but alas, no significant internal updates like the H2 chip now available in all other AirPods models, which powers features like voice isolation, nod/shake-your-head Siri interactions, and lossless audio support. That’s a slightly sour note, but on the whole, the AirPods lineup looks (and sounds) better than ever.


But, still, flying home from California on Tuesday, I was left with a feeling best described as ennui. On Threads, I summarized my feelings with one short sentence:

The obvious truth is that we all, including Apple, miss Steve Jobs.

Quoting me, Walt Mossberg responded:

This👇. Absolutely true. The post-Steve Jobs Apple has been a phenomenal money-making machine and has had a few notable product successes like AirPods and the Apple Watch. But it hasn’t replicated the big game changer product experience of the Jobs era.

I’ve been pondering this for the remainder of the week. One factor is that the iPhone defined the apex of personal computing. In the early years of PCs, everyone knew we wanted portability. Most of us — including me — thought we reached that with laptops. But laptops don’t go with us everywhere, and, it turns out, we want computers that go with us everywhere. That’s the iPhone, and the original iPhone in 2007 established the all-touch-screen form factor and general concept right out of the gate. That first iPhone blew our minds the moment Steve Jobs showed it to us.

Eventually, some company will introduce such a product again. It might be Apple. If it happens any time soon or soon-ish, it probably will be Apple. Apple, as a company, has a long-term strategy that it hopes will make it as likely as possible that it will be Apple. But there’s been no such product since the iPhone and, in my opinion, there is no technology extant today that would enable such a product. I feel confident that if Steve Jobs were alive and still leading Apple product development, there would have been no iPhone-like mind-blown-the-moment-you-first-saw-it new product in the intervening years.

Mossberg correctly cites AirPods and Apple Watches as big successes of the post-Jobs era. Not coincidentally, they are two of the three platforms Apple featured in last week’s event — and two of the three that people carry wherever they go. But we don’t really carry AirPods and Apple Watch — we wear them. They’re not more important than our iPhones, but they are more intimate, more personal.

But they’re also more subtle. When AirPods debuted many people thought they looked weird to wear. When Apple Watch debuted people were underwhelmed. But it turns out, perhaps even more so than with the iPhone, Apple nailed the design with both products right out of the gate. Apple Watch has an iconic, instantly recognizable form factor, but from a distance of more than a few feet away, it’d be hard to tell a brand-new Series 10 from 2015’s original “Series 0” models. The original watch straps remain compatible with the newest models — including even the Ultras. With AirPods, the stems have gotten noticeably shorter, but on the whole, they’re the exact same basic concept as the original models from 2016. They’re even offered in the exact same array of colors: white, white, or white.

What we’re seeing is Tim Cook’s Apple. Cook is a strong, sage leader, and the proof is that the entire company is now ever more in his image. That’s inevitable. It’s also not at all to say Apple is worse off. In some ways it is, but in others, Apple is far better. I can’t prove any of this, of course, but my gut says that a Steve-Jobs–led Apple today would be noticeably less financially successful and industry-dominating than the actual Tim-Cook–led Apple has been.

I think Apple Watch, under Jobs, would have been more like iPod was or AirPods have been: a product entirely defined by Apple, not a platform for third-party developers. (Jobs was famously reluctant to even make iPhone a platform.)

But the biggest difference is that Apple, under Jobs, was quirky, and I think would have remained noticeably more quirky than it has been under Cook. You’d be wrong, I say, to argue that Cook has drained the fun out of Apple. But I do think he’s eliminated quirkiness. Cook’s Apple takes too few risks. Jobs’s Apple took too many risks.

Tim Cook is no mere bean counter. Far from it. He shares with Jobs a driving ambition to change the world. Cook, just like Jobs, surely relates deeply to this quip from Walt Disney: “We don’t make movies to make money. We make money to make more movies.” But the ways Cook is driven to change the world are different.

Jobs often wore his heart on his shirtsleeve, not merely in public but on stage. Remember the 2010 WWDC keynote, when Jobs had an iPhone demo fail because the Wi-Fi wasn’t working, and 20 minutes later he came back on stage and angrily demanded that the media turn off their portable Wi-Fi base stations being used for live-blogging the keynote? He was fucking angry and he let us know it. You simply didn’t mess with a Steve Jobs keynote.

Cook almost never reveals his true passionate self in public. But at least one time he did. At the 2014 annual shareholders meeting, Cook faced a question from a representative of the right-wing National Center for Public Policy Research (NCPPR). As reported by Bryan Chaffin at The Mac Observer:

During the question and answer session, however, the NCPPR representative asked Mr. Cook two questions, both of which were in line with the principles espoused in the group’s proposal. The first question challenged an assertion from Mr. Cook that Apple’s sustainability programs and goals — Apple plans on having 100 percent of its power come from green sources — are good for the bottom line.

The representative asked Mr. Cook if that was the case only because of government subsidies on green energy. Mr. Cook didn’t directly answer that question, but instead focused on the second question: the NCPPR representative asked Mr. Cook to commit right then and there to doing only those things that were profitable.

What ensued was the only time I can recall seeing Tim Cook angry, and he categorically rejected the worldview behind the NCPPR’s advocacy. He said that there are many things Apple does because they are right and just, and that a return on investment (ROI) was not the primary consideration on such issues.

“When we work on making our devices accessible by the blind,” he said, “I don’t consider the bloody ROI.” He said the same thing about environmental issues, worker safety, and other areas where Apple is a leader.

As evidenced by the use of “bloody” in his response — the closest thing to public profanity I’ve ever seen from Mr. Cook — it was clear that he was quite angry. His body language changed, his face contracted, and he spoke in rapid fire sentences compared to the usual metered and controlled way he speaks.

He didn’t stop there, however, as he looked directly at the NCPPR representative and said, “If you want me to do things only for ROI reasons, you should get out of this stock.”

A philosophy of “Whatever is best for the ROI” is McKinsey-flavored cowardice, not leadership. Sometimes a leader needs to make decisions with uncertain business sense, or even knowing that the decision doesn’t make business sense, but simply because their intuition or conscience tell them it’s the right thing to do. Insofar as he’s willing to make such decisions, Cook is like Jobs. But what sort of decisions those are, are very different.

Jobs was driven to improve the way computers work. Cook is driven to improve the way humans live. Accessibility and the environment are much higher priorities under Cook than they were under Jobs. Apple’s entire foray into Health has occurred under Cook’s leadership — and Health-related features were tentpole features in last week’s keynote. I wouldn’t be surprised if it cost far more money to get AirPods Pro certified as medical-grade hearing aids than Apple will make back in profits from an increase in sales. The Apple 2030 initiative to bring the company’s entire carbon footprint to net zero emissions is fundamentally about doing the right thing, not just selling more products.

Jobs emphasized making more interesting products, and maximizing surprise and delight upon their unveilings. Tim Cook wouldn’t have sent the police after Gizmodo’s purloined iPhone 4 prototype; Steve Jobs probably thought Apple took it easy on them.

Cook values predictability. But predictability is in conflict with quirkiness. You don’t need to even recognize Mark Gurman’s name to have predicted that last week’s event would focus on new iPhones, new AirPods, and new Apple Watches. Nor do you need to follow the rumor mill to have correctly guessed that all of those new products would look very similar to the models that preceded them. Evolve, evolve, evolve. There’s no resting on laurels inside Apple with any of those products. The iPhones 16, Apple Watch Series 10, and AirPods 4 are all the result of intensely-focused industry-leading engineering (including in fields like material engineering) and design. But they also all basically look exactly like all of us expected them to look.

Remember the 3rd-gen iPod Nano in 2007? A.k.a. “the fat Nano”:

Marketing photo of five 3rd-gen “fat” iPod Nanos, blue, red, silver, green, and black.

It’s quite possible you don’t remember the fat Nano, because it wasn’t insanely great, even though it replaced 2nd-gen iPod Nano models that were. And so a year later, with the 4th-gen Nanos, Apple went back to the tall-and-skinny design, as though the fat Nano had never happened.

That fat Nano was quirky. It was also, in hindsight, obviously a mistake. I’m quite sure that inside Apple there were designers and product people who thought it was a mistake before it shipped. Steve Jobs shipped it anyway, surely because his gut told him it was the right thing to try. Tim Cook’s Apple doesn’t make mistakes like that. That’s ultimately why Cook’s Apple is more successful — with more customers, more revenue, higher margins (and thus more profit), and more, well, sheer dominance (and, thus, more regulatory scrutiny). But it’s also why Cook’s Apple delivers fewer surprises. The delight is still there, but there’s less amazement. It’s by design. They’re not trying but failing to reach the heights of the Jobs era’s ecstatic design novelty, because those peaks had accompanying valleys. Apple today is aiming for, and achieving, utterly consistent excellence. Quirkiness no longer fits.

The lampshade iMac G4. The G4 Cube. Peripherals like the iSight camera — a product that was never intended to sell in massive numbers, but belongs in a museum alongside the best designs from Dieter Rams’s Braun. Flower Power and Dalmatian iMacs. Under Jobs, Apple made many products that were memorable at each, or at least every other, revised generation. But sometimes — like the fat Nano — they were memorable for being duds. Under Cook, Apple produces instantly iconic designs that tend only to evolve — in mostly predictable ways, on mostly predictable schedules.4

Tim Cook’s Apple has not missed any fundamental new technology or shift that would have resulted in industry-changing new products or platforms that, if he’d lived, Steve Jobs would have led Apple to create through his singular genius or sheer force of will. But I also don’t think we’d have been left with new iPhones 16 that look like last year’s iPhones 15, which looked like the iPhones 14, which looked like the iPhones 13, which looked like the iPhones 12, which looked like the iPhones 11 except that year the side rails went from rounded back to flat, like the iPhone 4 and 5 models. I can show you an iPhone 4 and you’ll know, instantly, that it is either an iPhone 4 or 4S. I can show you an iPhone 5 and you’ll know it’s either an iPhone 5 or 5S (or the first iPhone SE). If it were a black-and-slate iPhone 5 you might even know it’s a 5 specifically, not a 5S, because the anodization of the black iPhone 5 wore off over time, producing a weathered look — a patina — that I found endearing in an evocative way no subsequent iPhone has. It was an imperfection, but sometimes imperfections are what we love most about products.5 Show me an iPhone from 2020–2024 a decade from now and I’ll have to remember exactly when the Action button and Camera Control appeared to know which was which.

Cook has patience where Jobs would grow restless. In the Jobs era, when a keynote ended, we’d sometimes turn to each other and say, “Can you believe ____?” No one asked that after last week’s keynote. Much of what Apple announced was impressive. Very little was disappointing. Nothing was hard to believe or surprising.

This isn’t bad for Apple, or a sign of institutional decline. If anything, under Cook, Apple more consistently achieves near-perfection. Tolerances are tighter. Ship dates seldom slip. But it’s a change that makes the company less fun to keenly observe and obsess over. Cook’s Apple is not overly cautious, but it’s never reckless. Jobs’s Apple was occasionally reckless, for better and worse.

My dissatisfaction flying home from last week’s event is, ultimately, selfish. I miss having my mind blown. I miss being utterly surprised. I miss occasionally being disappointed by a product design that stretched quirky all the way to wacky. I miss being amazed by something entirely unexpected out of left field. Poor me, stuck only with the announcement of noticeably improved versions of three products — three product families — that zillions of people around the world, myself included, carry with us wherever we go. 


  1. Moleskine has long made thin, vaguely Field-Notes-y “Cahier” notebooks, but they suck. After just a few days in my back pocket, pages would start falling out. Field Notes aren’t just fun; they are meant to be used and abused. ↩︎︎

  2. Also known as the iPod Nano pocket. ↩︎︎

  3. The overall devices are now, once again, slightly larger with the 16 Pro and Pro Max, too — which is not a positive for fans of smaller iPhones. iPhone 12/13 Mini holdouts, this footnote is for you. ↩︎︎

  4. I also believe that Apple would have gone back to live-on-stage keynotes, post-COVID, if Jobs were still around. The differences between those two keynote formats — live stage events vs. pre-filmed movies — are a good proxy for the changes to Apple as a whole. Apple’s modern pre-filmed keynotes are better for the company and better for most people who want to watch them. (Viewership numbers, I am reliably told, are higher than ever — and today dwarf the viewership of Jobs-era keynotes.) They’re far more interesting visually and more information-dense. (No need to wait for new presenters to walk on stage and then off again. Just cut to a new scene.) They’re more expensive to produce but the results are 100 percent predictable. No Apple keynote demo will ever fail again. But they’re also less exciting. Demo fails are fun (from the audience, not from the stage) and — as noted above, w/r/t the Wi-Fi at WWDC 2010 — memorable. And the possibility that any live demo might fail adds a degree of palpable tension to every demo, even the ones that wind up going off without a hitch. It’s like the difference between watching a live motorcycle stunt show and a Fast and Furious movie. The movie looks better, but carries no sense of actual danger. ↩︎︎

  5. Or people. ↩︎︎


Thierry Breton Resigns, Forced Out by the European Commission President 

Thierry Breton, in a letter to Ursula von der Leyen, President of the European Commission:

On 24 July, you wrote to Member States asking them to nominate candidates for the 2024-2029 College of Commissioners, specifying that Member States that intend to suggest the incumbent Member of the Commission were not required to suggest two candidates. On 25 July, President Emmanuel Macron designated me as France’s official candidate for a second mandate in the College of Commissioners — as he had already publicly announced on the margins of the European Council on 28 June. A few days ago, in the very final stretch of negotiations on the composition of the future College, you asked France to withdraw my name — for personal reasons that in no instance you have discussed directly with me — and offered, as a political trade-off, an allegedly more influential portfolio for France in the future College. You will now be proposed a different candidate.

Over the past five years, I have relentlessly striven to uphold and advance the common European good, above national and party interests. It has been an honour.

However, in light of these latest developments — further testimony to questionable governance — I have to conclude that I can no longer exercise my duties in the College.

I am therefore resigning from my position as European Commissioner, effective immediately.

Translation from bureaucratese to English: “Faced with being fired for being a jackass or resigning, I resign.”

I’m starting to get the feeling that the EC’s regulatory arm is not, in fact, politically popular in the EU.

Tiptop 

My thanks to Tiptop for sponsoring DF last week. Tiptop is a completely new way to pay that makes everything you buy more affordable with trade-in at checkout. It’s incredibly easy. At checkout, you simply select any item you own that you want to trade in from Tiptop’s catalog of over 50,000 choices, and you’ll receive instant credit towards your purchase.

If you’re a merchant, you can easily enable Tiptop with no up-front costs, Shopify checkout support, and great APIs for integration with any store. Plus, they are currently offering $10,000 in Tiptop Promotional Credit that you can use to help your customers learn about Tiptop without discounting.

Get started now as a merchant, or experience Tiptop as a shopper at partners like Nothing Tech, Daylight, Cradlewise, and King of Christmas. (And if you’re thinking Tiptop rings a bell, you may recall that after leaving TechCrunch last year, Matthew Panzarino joined Tiptop, and we chatted about it when last he was on The Talk Show earlier this year.)

Dithering, and This Week’s Apple Event 

September 2024 cover art for Dithering, depicting Lynn Swann’s 53-yard circus catch during the Steelers’ 21-17 victory over the Cowboys in Super Bowl X in 1976.

I’m still collecting my thoughts on this week’s “It’s Glowtime” Apple event, and where Apple stands in general. But this episode of Dithering that dropped Friday morning captures my high-level thoughts well. We haven’t done this in a while, but we’re making it free for everyone to listen to. Give it a listen, while I continue to write and think. (We also have a feed of our occasional free episodes; search your podcast player for “Dithering” and it should show up.)

Dithering as a standalone subscription costs just $7/month or $70/year. You get two episodes per week, each exactly 15 minutes long, with yours truly and Ben Thompson. I just love having an outlet like Dithering for weeks like this one. People who try Dithering seem to love it, too — we have remarkably little churn.

(You can also get Dithering by subscribing to Stratechery, a bundle that includes all of Ben’s writing, his interviews, plus the Sharp Tech, Sharp China, and Greatest Of All Talk podcasts — all of that, including Dithering, for just $15/month or $150/year.)

OpenAI Releases New o1 Reasoning Model 

Kylie Robison, reporting for The Verge:

OpenAI is releasing a new model called o1, the first in a planned series of “reasoning” models that have been trained to answer more complex questions, faster than a human can. It’s being released alongside o1-mini, a smaller, cheaper version. And yes, if you’re steeped in AI rumors: this is, in fact, the extremely hyped Strawberry model.

For OpenAI, o1 represents a step toward its broader goal of human-like artificial intelligence. More practically, it does a better job at writing code and solving multistep problems than previous models. But it’s also more expensive and slower to use than GPT-4o. OpenAI is calling this release of o1 a “preview” to emphasize how nascent it is. [...]

“The model is definitely better at solving the AP math test than I am, and I was a math minor in college,” OpenAI’s chief research officer, Bob McGrew, tells me. He says OpenAI also tested o1 against a qualifying exam for the International Mathematics Olympiad, and while GPT-4o correctly solved only 13 percent of problems, o1 scored 83 percent.

Putting aside the politics and other legitimate social and legal concerns around AI, scoring that well in a difficult math exam is just incredible.

Update: Robison wrote:

I wasn’t able to demo o1 myself, but McGrew and Tworek showed it to me over a video call this week. They asked it to solve this puzzle:

“A princess is as old as the prince will be when the princess is twice as old as the prince was when the princess’s age was half the sum of their present age. What is the age of prince and princess? Provide all solutions to that question.”

The model buffered for 30 seconds and then delivered a correct answer.

I found this puzzle pretty damn tricky, personally. I pasted it, verbatim, into ChatGPT-4o and it solved it, correctly, the first time. I pasted it into the new o1-Preview model, and it both took longer and gave me the incorrect answer. I replied to o1-Preview, “Are you sure about that answer? Can you try it again?” and this time it gave me the correct answer. Still impressive, but kind of weird that this was OpenAI’s own example puzzle intended to show off the new o1-Preview model.
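
For anyone who’d rather run the same comparison through the API than in the ChatGPT UI, here’s a minimal sketch using OpenAI’s Python client. It assumes the `openai` package (v1.x), an `OPENAI_API_KEY` in your environment, and that the `gpt-4o` and `o1-preview` model names are still what they were at launch; it’s an illustration, not anything OpenAI published.

    # Minimal sketch: send the identical puzzle prompt to gpt-4o and o1-preview.
    # Assumes the openai Python package (v1.x) and OPENAI_API_KEY set in the environment.
    from openai import OpenAI

    client = OpenAI()

    puzzle = (
        "A princess is as old as the prince will be when the princess is twice "
        "as old as the prince was when the princess's age was half the sum of "
        "their present age. What is the age of prince and princess? "
        "Provide all solutions to that question."
    )

    for model in ("gpt-4o", "o1-preview"):
        reply = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": puzzle}],
        )
        print(f"--- {model} ---")
        print(reply.choices[0].message.content)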

Spoilers follow. Avert your eyes from the remainder of the post if you want to solve this one on your own. Here’s how I solved the puzzle, with pen and paper, before pasting the puzzle into any LLMs:

Let y = the princess’s age now and x = the prince’s. Let d = the delta between the princess’s and the prince’s ages. By definition, at any given year in time, d = y - x and therefore y = x + d. (To be pedantic, d equals the absolute value of y - x, but somehow it’s obvious to me, from the phrase “as the prince will be”, that the princess is older than the prince.)

We care about three years:

  1. Now.
  2. When the princess is half the sum of their combined ages from year (1).
  3. When the princess is twice the prince’s age from year (2).

For (1), we know by definition that this is always true no matter what year it is: y = x + d — that is to say, the princess is d years older than the prince.

For (2) we can express the princess’s age as:

(y + x) / 2

And from (1) we know that no matter what year it is, the prince is d years younger than the princess. So during year (2), the prince’s age can be expressed as:

((y + x) / 2) - d

and year (3) is defined as when the princess (y) is twice the above (the prince’s age from year (2)), so the princess’s age in year (3) can be expressed as:

2((y + x) / 2) - 2d

And in any given year, the prince’s age is the princess’s minus d, which can thus be expressed, for year (3), by subtracting one more d from the line above:

2((y + x) / 2) - 3d

Cancelling out those 2’s:

y + x - 3d

That is the prince’s age for year (3). The puzzle’s definition is that princess’s age now (y) is the same as prince’s in year (3), the line above. So we can form an equation:

y = y + x - 3d

Those y’s cancel out, so we are left with:

x = 3d

And by definition y is always x + d (the prince’s age plus their age difference), so:

y = 4d

So for any given difference (d) in their ages, the prince must be 3 times d and the princess 4 times d:

Difference (d)    Princess = 4d    Prince = 3d
1                 4                3
2                 8                6
3                 12               9
4                 16               12

So the generalized solution is any pair of ages where the princess is 4/3 the age of the prince. I double-checked this mentally by applying all the clauses of the puzzle to the princess’s and prince’s ages in each line of the table above.
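
To double-check that answer mechanically, here’s a small brute-force sketch in Python; it assumes whole-number ages (which the puzzle doesn’t strictly require) and simply applies the puzzle’s clauses to every candidate pair, no algebra involved:

    # Brute-force check over whole-number ages (an assumption for the sketch;
    # the algebra above doesn't require integers). Applies the puzzle's clauses directly.
    for princess in range(1, 101):
        for prince in range(1, princess):       # "as the prince will be" implies the princess is older
            d = princess - prince               # the age gap never changes
            half_sum = (princess + prince) / 2  # year (2): princess's age is half the sum of present ages
            prince_at_2 = half_sum - d          # the prince's age in that year
            princess_at_3 = 2 * prince_at_2     # year (3): princess is twice the prince's year-(2) age
            prince_at_3 = princess_at_3 - d     # the prince's age in year (3)
            if prince_at_3 == princess:         # the princess now is as old as the prince will be then
                print(princess, prince)         # prints 4 3, 8 6, 12 9, 16 12, ...

Every pair it prints has the princess at 4d and the prince at 3d, matching the table.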

That’s my answer and my thinking. Here’s a link to my ChatGPT transcript. It’s all one chat, with my first pasting of the puzzle sent to GPT-4o, and all my subsequent comments (including the second pasting of the puzzle) being sent to o1-Preview.

FDA Grants Approval to AirPods Pro 2 for Use as Hearing Aids 

Brian Heater, reporting for TechCrunch:

The iPhone 16 took center stage at Apple’s “It’s Glowtime” event, but the most interesting tidbit came from a different line entirely. Indeed, among a sea of new hardware came an intriguing software update to one already on the market: the AirPods Pro 2.

Apple announced that its most premium earbuds would double as an over-the-counter hearing aid, courtesy of a software update, pending approval from the U.S. Food and Drug Administration.

The FDA on Thursday announced that it has granted what it calls “the first over-the-counter (OTC) hearing aid software device, Hearing Aid Feature.” Specifically, it has approved the software update that enables that functionality.

In briefings on Monday, Apple employees expressed what I can only describe as confidence that FDA approval for this would be imminent, but like sports fans, it was almost as though they didn’t want to jinx it. Asked if FDA approval might come before the iOS 18.0 and MacOS 15.0 updates scheduled for this coming Monday, they wouldn’t really answer, but had looks on their faces that said that’s what we’re hoping.

What a great feature this seems to be.