By John Gruber
1Password — Secure every sign-in for every app on every device.
Todd Heberlein:
Cozy mysteries are a genre of crime fiction where the stories take place in small, socially intimate communities, and any violence is limited or happens offscreen. Yesterday, I experienced a “Cozy WWDC,” and it was wonderful!
The event took place in an intimate setting with about 170 developers. There were no highly produced skits, no jabs at the competition, no speculative non-existent products designed to make the media and influencers lose their shit, and no media. The event, titled “Envision the Future: Build Great Apps for visionOS,” was held at the Apple Developer Center in Cupertino on October 2nd.
It focused solely on visionOS and spanned just one day.
The presenters were live. Many wrote code and showed the results live. Sometimes demos didn’t work the first time.
I have heard from a few other attendees that this was an excellent and very productive little event.
Panic:
Well, Google has a new set of policies that require apps that connect to Google Drive to go through expensive, time-consuming annual reviews, and this has made it extremely difficult for us to reasonably maintain Google Drive access. You may have seen iA Writer’s announcement that they are stopping development of their Android version for similar reasons. Our experience was different, but our circumstances are similar. [...]
Between the weeks of waiting, submitting the required documentation and the process of scanning the code, it took a significant amount of time from our engineers. For example, Google provided a Docker image for running the scanner, but it didn’t work. We had to spend more than a week debugging and fixing it. And because the scanner found no problems, it didn’t result in any improvements to Transmit. No one benefitted from this process. Not Google, not Panic, and not our users. [...]
But then… a couple of months later, Google completely removed the option for us to scan our own code. Instead, to keep access to Google Drive, we would now have to pay one of Google’s business partners to conduct the review. They promised a discounted minimum price, but no maximum price. We realized that either we’d most likely be paying someone else a chunk of cash to run the same scanner we were running, or our bill would end up much higher.
Never been gladder that I don’t use Google Drive for anything.
Peter Baker and Dylan Freedman, reporting for The New York Times, with the conspicuous absence of Maggie Haberman from that shared byline:
Former President Donald J. Trump vividly recounted how the audience at his climactic debate with Vice President Kamala Harris was on his side. Except that there was no audience. The debate was held in an empty hall. No one “went crazy,” as Mr. Trump put it, because no one was there.
Anyone can misremember, of course. But the debate had been just a week earlier and a fairly memorable moment. And it was hardly the only time Mr. Trump has seemed confused, forgetful, incoherent or disconnected from reality lately. In fact, it happens so often these days that it no longer even generates much attention.
He rambles, he repeats himself, he roams from thought to thought — some of them hard to understand, some of them unfinished, some of them factually fantastical. He voices outlandish claims that seem to be made up out of whole cloth. He digresses into bizarre tangents about golf, about sharks, about his own “beautiful” body. He relishes “a great day in Louisiana” after spending the day in Georgia. He expresses fear that North Korea is “trying to kill me” when he presumably means Iran. As late as last month, Mr. Trump was still speaking as if he were running against President Biden, five weeks after his withdrawal from the race.
Better late than never, but if it were Joe Biden who had rambled on about “the audience going crazy” at a debate that had no audience, the New York Times would have been all over it the next day, not a month later.
I don’t think Donald Trump was ever hooked up right. But he’s clearly losing the few marbles he ever had to dementia, just like his father did. The signs were clear during his 2017–2021 term in office:
John F. Kelly, his second White House chief of staff, was so convinced that Mr. Trump was psychologically unbalanced that he bought a book called “The Dangerous Case of Donald Trump,” written by 27 mental health professionals, to try to understand his boss better. As it was, Mr. Kelly came to refer to Mr. Trump’s White House as “Crazytown.”
Of course the Times had to both-sides this story, and this is who they found to do it:
Sam Nunberg, a former Trump political adviser, said he still talked with people who see him almost daily, and had not heard of any concerns expressed about the former president’s age. “I don’t really see any major difference,” he said. “I just don’t see it.”
Nunberg is the guy who showed up shitfaced drunk on half a dozen cable news appearances at the height of the Robert Mueller investigation. That’s the guy saying, sure, Trump is OK in the head today.
If you haven’t watched Trump speak in a while — because you’re on team “fuck that guy”, like any sane voter — you should watch the video clips the Times culled for this piece. Like I said, I don’t think the guy was ever hooked up right, but it’s very clear he’s in serious decline today.
My suggestion to the Harris campaign is that they should repeatedly describe Trump as “an 80-year-old”, and force Trump surrogates to correct them that he’s “only” 78.
Joe Rossignol, writing for MacRumors:
The latest video of what could be a next-generation MacBook Pro was shared on YouTube Shorts today by Russian channel Romancev768, just one day after another Russian channel shared a similar video. The clip shows a box for a 14-inch MacBook Pro that is apparently configured with an M4 chip with a 10-core CPU and a 10-core GPU, 16GB of RAM, 512GB of storage, three Thunderbolt 4 ports, and a Space Black finish. [...]
The source of these leaks is unclear. Last week, “ShrimpApplePro” claimed that at least one of the unannounced 14-inch MacBook Pro units was apparently being offered for sale in a private Facebook group. In a follow-up post on X on Sunday, the leaker claimed that he saw someone online who was apparently advertising 200 of the unannounced 14-inch MacBook Pro units for sale, leading him to believe this leak originates from a warehouse. It is unclear if these details are accurate, but this whole situation is clearly very sketchy.
It’s somewhat weird that the box art is identical to that of last year’s M3 MacBook Pros, but I lean toward thinking these are real. Best guess is that someone stole 200 of these from China and some or all of them wound up in Russia? No sympathy for Apple here if that’s what happened — if you assemble your products in a dictatorship, stuff like this is bound to happen. Kinda surprising it hasn’t happened with iPhones, which would garner far more attention and value a month ahead of launch. That it hasn’t happened with iPhones probably indicates that Apple puts more security around them than they do MacBook Pros.
Juli Clover, MacRumors:
In the release notes for the sixth beta of the macOS Sequoia 15.1 update, Apple says that users aren’t going to see as many popups for apps they regularly use.
Applications using our deprecated content capture technologies now have enhanced user awareness policies. Users will see fewer dialogs if they regularly use apps in which they have already acknowledged and accepted the risks.
Why in the world didn’t Apple take regular use of a screen-recording app into account all along?
Tyler Stalman joins the show to discuss the iPhone 16 lineup’s cameras, and the state of iPhone photography.
Sponsored by:
Sean Hollister, writing for The Verge:
Google’s Android app store is an illegal monopoly — and now it will have to change. Today, Judge James Donato issued his final ruling in Epic v. Google, ordering Google to effectively open up the Google Play app store to competition for three whole years. Google will have to distribute rival third-party app stores within Google Play, and it must give rival third-party app stores access to the full catalog of Google Play apps, unless developers opt out individually.
These were Epic’s biggest asks, and they might change the Android app marketplace forever — if they aren’t immediately paused or blocked on appeal. And they’re not all that Epic has won today. Starting November 1st, 2024, and ending November 1st, 2027, Google must also:
- Stop requiring Google Play Billing for apps distributed on the Google Play Store (the jury found that Google had illegally tied its payment system to its app store)
- Let Android developers tell users about other ways to pay from within the Play Store
- Let Android developers link to ways to download their apps outside of the Play Store
- Let Android developers set their own prices for apps irrespective of Play Billing
If this ruling holds on appeal, it’s a real loss for Google, not a token loss.
Update: Regarding the bit in the first paragraph above, about rival app stores getting access to all apps in the Play Store unless the developers opt out, I was originally confused how this could possibly work. I should have read the injunction first. It states:
For a period of three years, Google will permit third-party Android app stores to access the Google Play Store’s catalog of apps so that they may offer the Play Store apps to users. For apps available only in the Google Play Store (i.e., that are not independently available through the third-party Android app store), Google will permit users to complete the download of the app through the Google Play Store on the same terms as any other download that is made directly through the Google Play Store. Google may keep all revenues associated with such downloads. Google will provide developers with a mechanism for opting out of inclusion in catalog access for any particular third-party Android app store. Google will have up to eight months from the date of this order to implement the technology necessary to comply with this provision, and the three-year time period will start once the technology is fully functional.
This is far less radical a dictum than Hollister’s description led me to believe. What Judge Donato is demanding is effectively pass-through to the actual Play Store listing for any apps and games that aren’t available in a third-party app store. So if you search in the Brand X app store for “FooApp” but FooApp isn’t available in the Brand X store, Brand X’s store app can let you install and download FooApp from the Play Store. But that counts as a regular Play Store installation. It’s just a way to encourage users of third-party stores to search those stores first, even though the vast majority of apps will likely remain exclusively in the Play Store.
Sarah Krouse, Dustin Volz, Aruna Viswanatha, and Robert McMillan, reporting for The Wall Street Journal:
For months or longer, the hackers might have held access to network infrastructure used to cooperate with lawful U.S. requests for communications data, according to people familiar with the matter, which amounts to a major national security risk. The attackers also had access to other tranches of more generic internet traffic, they said. Verizon Communications, AT&T and Lumen Technologies are among the companies whose networks were breached by the recently discovered intrusion, the people said.
The widespread compromise is considered a potentially catastrophic security breach and was carried out by a sophisticated Chinese hacking group dubbed Salt Typhoon. It appeared to be geared toward intelligence collection, the people said. [...]
The surveillance systems believed to be at issue are used to cooperate with requests for domestic information related to criminal and national security investigations. Under federal law, telecommunications and broadband companies must allow authorities to intercept electronic information pursuant to a court order. It couldn’t be determined if systems that support foreign intelligence surveillance were also vulnerable in the breach.
This incident should henceforth be the canonical example when arguing against “back doors for the good guys” in any networks or protocols. It’s not fair to say that all back doors will, with certainty, eventually be compromised, but the more sensitive and valuable the communications, the more likely it is that they will. And this one was incredibly sensitive and valuable. There are downsides to the inability of law enforcement to easily intercept end-to-end encrypted communication, but the potential downsides of back doors are far worse. Law enforcement is supposed to be hard work.
We should rightfully blame China first for this attack — and the U.S. government ought to start treating such attacks by China as part of the second Cold War that they are, and retaliate in big ways — but secondary blame must go to Congress for passing the Communications Assistance for Law Enforcement Act (CALEA) in 1994, and to the FCC for broadening its interpretation a decade later. Verizon, AT&T, and the other companies whose networks were breached were — and remain — required by law to provide the back doors that the Chinese hackers exploited.
John Naughton, writing for The Guardian:
Once the use of RSS feeds had become common, someone had the idea that audio files could be attached to them, and Dave implemented the idea with a nice geeky touch — attaching a song by the Grateful Dead. Initially the new technology was called audio blogging, but eventually a British journalist came up with the term “podcasting” and it stuck.
So Dave was present at the creation of some cool stuff, but it was blogging that brought him to a wider public. “Some people were born to play country music,” he wrote at one stage. “I was born to blog. At the beginning of blogging I thought everyone would be a blogger. I was wrong. Most people don’t have the impulse to say what they think.” Dave was the exact opposite. He was (and remains) articulate and forthright. His formidable record as a tech innovator meant that he couldn’t be written off as a crank. The fact that he was financially secure meant that he didn’t have to suck up to anyone: he could speak his mind. And he did. So from the moment he launched Scripting News in October 1994 he was a distinctive presence on the web.
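For readers unfamiliar with the mechanism Naughton is describing: attaching audio to a feed is done with RSS 2.0’s `<enclosure>` element on an item. A minimal sketch (the URL, title, and file details here are made up for illustration):

```xml
<item>
  <title>Morning Coffee Notes</title>
  <!-- The enclosure points an aggregator at the audio file:
       url = where to download it, length = size in bytes,
       type = MIME type of the file -->
  <enclosure url="https://example.com/audio/show-2004-06-11.mp3"
             length="12345678"
             type="audio/mpeg"/>
  <pubDate>Fri, 11 Jun 2004 09:00:00 GMT</pubDate>
</item>
```

A podcast client polls the feed, sees a new item with an enclosure, and downloads the audio in the background; that simple addition to RSS is what made podcasting possible.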
One of Winer’s numerous aphorisms that resonates deeply with me: People return to places that send them away.
Dave Winer:
Today is the 30th anniversary of this blog. Hola!
I did a roundup of thoughts when this blog turned 25. I stand by what I wrote then, but I’d add this. My blog started because I needed content to test a script I had written that sent emails on my Mac using Eudora, which was an early scriptable app and I had a nice scripting system that worked with it. I looked around for something to send (30 years ago today), and shot out an email to the people whose business cards I had collected at various tech conferences. It was a thrill, so I did it again, and again and three more times, before I realized hey I could use this thing to get my own ideas out there. And thus began this thing that I still do to this day.

Look at the two posts I wrote about WordPress in the last few days. There may be hope to find a blogosphere buried somewhere in there. And it may be possible to give them some sweet new writing tools so they can get excited about writing on the web the way we did all those years ago. I actually am kind of optimistic about that. Maybe we can stand up something in the midst of the noise.

When we booted up podcasting, approx 20 years ago, we had a slogan — “Users and developers party together.” It worked! That is still the way I want to build stuff, it’s the only way I know how to do it. Blogging started out as a programming adventure and eventually became a form of literature. How about that. I’m up for doing more of that if you all are. But please expect to make contributions, don’t expect it all to come to you for free, because as we know nothing really is free.
Winer is rightfully renowned for his technical achievements — outliners as an application genre, RSS in general, and RSS in the specific context of podcasting in particular — but what’s kept me reading Scripting News for the entirety of its 30-years-and-counting run is his writing. He has a distinctive writing voice that is impossible to imagine in any medium other than the web. But I think that’s because he helped define what writing not just on the web, but for the web, even meant.
Thanks for it all, Dave.
Aaron Vegh and Ben Rice McCarthy (of Obscura renown) have teamed up to create Croissant, a new app — currently iPhone-only — for cross-posting to Mastodon, Threads, and Bluesky. 15 years ago I wrote “Twitter Clients Are a UI Design Playground” and that piece stands up, but the playground is no longer Twitter/X in particular (certainly not anymore — X support is conspicuously omitted from Croissant’s current lineup of supported platforms) but tweet-like platforms in general. Croissant proves that this domain remains a UI playground. It’s both visually distinctive and intuitively familiar, with a fun and fluid UI. It’s the sort of app that I want to find reasons to use.
Free to download and try with a single account; $3/month, $20/year, or $60 as a one-time purchase for multi-account support, which is where Croissant really shines.
See also: Dan Moren at Six Colors, John Voorhees at MacStories, and Nick Heer at Pixel Envy.
My thanks to WorkOS for, once again, sponsoring the week at Daring Fireball. WorkOS is a modern identity platform for B2B SaaS. Start selling to enterprise customers with just a few lines of code. Ship complex features like SSO and SCIM (pronounced skim) provisioning in minutes instead of months.
Today, some of the fastest growing startups are already powered by WorkOS, including Perplexity, Vercel, and Webflow.
For SaaS apps that care deeply about design and user experience, WorkOS is the perfect fit. From high-quality documentation to self-serve onboarding for your customers, it removes all the unnecessary complexity for your engineering team.
Another good overview of the Automattic/WP Engine saga, this one from Ari Levy at CNBC:
Mullenweg may be openly enthusiastic and grateful for the employees he still has on board, but the WordPress community is a mess. Many WP Engine customers are suffering, and Automattic is gearing up for a legal fight against a private equity firm with over $100 billion in assets.
Hard not to be reminded, somewhat, of the righteous indignation fueling Steve Jobs’s end of life crusade against Google for creating Android. Some big fundamental differences, of course. WordPress is GPL open source and iOS isn’t open at all. It’s the righteous fervor of the founder/leader of the company that’s reminiscent.
Emma Roth does the yeoman’s work of summarizing the complex and fast-moving legal feud between WordPress’s commercial arm and WP Engine, a major WordPress hosting provider:
Over the past several weeks, WordPress cofounder Matt Mullenweg has made one thing exceedingly clear: he’s in charge of WordPress’ future.
Mullenweg heads up WordPress.com and its parent company, Automattic. He owns the WordPress.org project, and he even leads the nonprofit foundation that controls the WordPress trademark. To the outside observer, these might appear to be independent organizations, all separately designed around the WordPress open-source project. But as he wages a battle against WP Engine, a third-party WordPress hosting service, Mullenweg has muddied the boundaries between three essential entities that lead a sprawling ecosystem powering almost half of the web.
To Mullenweg, that’s all fine — as long as it supports the health of WordPress long-term.
See also: Mullenweg’s “alignment” offer to Automattic’s nearly 1,900 employees.
Taegan Goddard, writing at Political Wire:
It’s worth recalling that a major reason Trump won in 2016 was that, just before the election, news broke about emails related to a closed investigation into Hillary Clinton’s emails being found on Anthony Weiner’s computer, the estranged husband of a top Clinton aide.
In the end, nothing came of this discovery, but the extensive news coverage of it almost certainly swayed the election. It was the top story in every major newspaper.
But this new evidence presented against Trump wasn’t even the lead story in the New York Times or Washington Post this morning. And it didn’t even make the front page of the Wall Street Journal or USA Today.
It’s true that millions of words have already been written about Trump’s attempt to overturn the 2020 election. But there was plenty of new information included in this filing which is directly relevant to the biggest news story this month.
This, I think, is entirely explained by the conventional wisdom that the U.S. news media is “liberal” — a notion instilled by a decades-long work-the-refs strategy from Republicans. The truth is the news media is effectively in the tank for Trump, sanewashing his literal nonsense, outright lies, and violence-inspiring hate speech against even legal immigrants. But our major political news media remains so hyper-focused on appearing not to favor one political side over the other that it’s completely lost sight of what ought to be their north star: the truth. If the truth favors one party over the other, so be it. That’s the job of reporting the news.
The difference between how these same publications treated Hillary Clinton’s “but her emails” nonsense in 2016 compared to Jack Smith’s motion this week could not be more stark.
Update: If you prefer, imagine if a special counsel appointed by the Attorney General submitted a brief alleging any crimes at all committed by Kamala Harris. Let’s say personal tax evasion — crimes, but insignificant compared to multiple attempts to overthrow the results of the last presidential election. The major U.S. newspapers and cable channels would have covered nothing else in the days since. Yet for this brief laying out copious evidence that Trump attempted the worst crime imaginable against U.S. democracy itself, it’s relative crickets chirping and shoulder shrugs. Remember too that Trump is already a convicted felon. If Harris had been convicted of a felony this year, do you think it would be mentioned more frequently in news stories than it actually is for Trump? If you don’t, I have a bridge to sell you.
I missed this announcement from MLB a month ago:
Major League Baseball today announced a new multi-year international partnership with European workwear leader STRAUSS that makes the German company the Official Workwear Partner of MLB. The partnership marks STRAUSS’ first league-wide deal in the United States. STRAUSS entered the U.S. market in late 2023, and American brand awareness is the cornerstone of its marketing efforts. The new partnership also affords STRAUSS marketing rights with MLB across Canada, Mexico and Europe. As part of the deal, STRAUSS’ name and logo will adorn MLB batting helmets during the Postseason and regular season games in Europe, as well as MiLB batting helmets all season long, beginning in 2025.
But I couldn’t miss it watching postseason games on TV this week: there’s a ridiculous-looking “Strauss” on both sides of every player’s batting helmet now, as prominent as the team logo on the front. It looks even more desperate and obsequious on the helmets than it does printed in all-caps in MLB’s bootlicking press release. This is the sort of gimmick you expect from a struggling independent minor league team, not Major League Baseball.
They should’ve put the rights to these on-helmet ads up for public auction. I’d have chipped in for a fan-backed initiative to buy that on-helmet ad space to affix this slogan: “FIRE ROB MANFRED”.
Victoria Gomelsky, reporting with absurd credulity for The New York Times:
Hodinkee, the watch enthusiast website based in Manhattan that has helped spread the gospel of mechanical watchmaking since its founding in 2008, has a new owner.
On Friday, the Watches of Switzerland Group, one of the world’s largest watch retailers with more than 220 multibrand and brand stores in Britain and the United States, announced that it had acquired the media company, which includes a website, a magazine, a brand partnerships division and an insurance business. Neither company would disclose the terms of the deal. [...]
Both Mr. Clymer and Mr. Hurley said Hodinkee’s staff, which now totals about 35 people, would remain intact and that its editorial team would remain independent of Watches of Switzerland oversight.
“But at a point in time,” Mr. Hurley said, “when you click on the Hodinkee Shop, you will see the full range of the product that WatchesofSwitzerland.com carries. We are going to do some work over the next several months to make that effectively seamless.”
There is a name for a publication that is owned by a retailer: catalog. I’d love to be proven wrong and see Hodinkee return to excellence, but that seemed far more likely as an independent website than as a subsidiary of the world’s largest premium watch retailer. For years I read Hodinkee daily; for the last few years I largely stopped reading it at all. Here’s Clymer’s own column announcing the acquisition (“joining forces”) and his return to day-to-day leadership of the site.
An important follow-up to yesterday’s item about Russia demanding Apple remove VPN apps from the Russian App Store: you can use a VPN on iOS without an app. It just requires some futzing in Settings and a VPN provider that supports it. Presumably, this technique remains available to iPhone users in Russia. Here are instructions from one such VPN provider, ForestVPN:
- Access Settings:
  - Open the Settings app on your iPhone.
  - Tap on General and scroll to VPN & Device Management.
- Add VPN Configuration:
  - Select Add VPN Configuration.
  - Choose your desired protocol, such as L2TP or IKEv2.
- Enter VPN Details:
  - Fill in the necessary fields like Description, Server, Remote ID, and Local ID. These details can be found on the ForestVPN website.
- Save and Connect:
  - Tap Done to save your configuration.
  - Enable the VPN by toggling the switch next to your newly created profile.
VPN apps remove complexity from this process, but it’s worth noting that VPN access doesn’t require an app.
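For the curious, the same no-app setup can also be delivered as an Apple configuration profile — a `.mobileconfig` plist you open on the device — rather than typed into Settings by hand. A rough sketch of an IKEv2 VPN payload follows; the server names, identifiers, and UUIDs are placeholders, and a real profile would need the authentication details your VPN provider specifies:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <!-- Outer wrapper: the profile itself -->
  <key>PayloadType</key><string>Configuration</string>
  <key>PayloadIdentifier</key><string>com.example.vpn-profile</string>
  <key>PayloadUUID</key><string>REPLACE-WITH-A-UUID</string>
  <key>PayloadVersion</key><integer>1</integer>
  <key>PayloadDisplayName</key><string>Example VPN Profile</string>
  <key>PayloadContent</key>
  <array>
    <dict>
      <!-- Inner payload: the VPN configuration -->
      <key>PayloadType</key><string>com.apple.vpn.managed</string>
      <key>PayloadIdentifier</key><string>com.example.vpn-payload</string>
      <key>PayloadUUID</key><string>REPLACE-WITH-ANOTHER-UUID</string>
      <key>PayloadVersion</key><integer>1</integer>
      <key>UserDefinedName</key><string>Example VPN</string>
      <key>VPNType</key><string>IKEv2</string>
      <key>IKEv2</key>
      <dict>
        <key>RemoteAddress</key><string>vpn.example.com</string>
        <key>RemoteIdentifier</key><string>vpn.example.com</string>
        <key>LocalIdentifier</key><string>client@example.com</string>
        <!-- Real profiles use Certificate or SharedSecret here -->
        <key>AuthenticationMethod</key><string>None</string>
        <key>ExtendedAuthEnabled</key><integer>1</integer>
      </dict>
    </dict>
  </array>
</dict>
</plist>
```

Installing such a profile creates the same entry under VPN & Device Management as the manual steps above, which is presumably why censoring the App Store alone can’t fully block VPN use.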
Chili Palmer, reporting for HighSpeedInternet:
Starlink announced on Oct. 2 it will offer one month of free internet in Hurricane Helene disaster areas. The free service will be available to new customers who order through the Starlink website and to customers who activate a kit they already have, whether it was donated or purchased at a retail store. Existing customers may also be eligible.
The announcement comes after more than 500 Starlink kits were distributed throughout the disaster area by private relief organizations.
It’s hard to overstate how differently Elon Musk would be perceived if he weren’t a whackjob on political and cultural issues.
Ryan Christoffel, writing for 9to5Mac:
Hurricane Helene has caused massive damage and taken over 100 lives across several US states. Many thousands of people are without power and/or cell service. But in the wake of the storm, reports have surfaced about a key iOS 18 feature that has been a lifeline for survivors: Messages via satellite.
Apple added Messages via satellite to millions of iPhones via its recent iOS 18 update. And now, according to reports on social media, it seems the feature arrived just in time. Here are a few tweets highlighting how useful the feature has proven.
It’s great that iOS 18 shipped before Helene hit, but a shame that it’s so new that most users haven’t yet upgraded. And once Helene hit and knocked out all comms in the most severely hit areas, it was too late. (Apple hasn’t yet pushed iOS 18 to the majority of users whose devices are set to install updates automatically, and typically doesn’t do so with new iOS versions until the .1 release in October or November.) Some heads-up people were recommending that iPhone 14 and 15 users in Helene’s path update to iOS 18 before it hit, specifically to get this feature. But still: the feature is already making a huge difference.
Cool Hunting:
We love getting into the nerdy details of design innovations and the iPhone 16’s new Camera Control button presented a perfect opportunity to dig in. For this first podcast of our new Design Tangents series, aptly named Nerdy Details, we sit down with Johnnie Manzari from the Apple Human Interface team and Rich Dinh, Senior Director of Product Design, to talk about cameras and photography through the lens of the new control on “the world’s most popular camera.”
You don’t often get to hear Apple employees speak about their work. When you do, it’s often largely about trying to get the feel right.
Zac Hall, 9to5Mac:
iPhone users are being notified about an excessive heat weather event through Apple’s Weather app on iPhone. While the weather event is happening in the Santa Clara Valley region of California, the alert says that the occurrence is happening in an area nearby regardless of where you live.
Hall had a good theory — that the warnings were being delivered to people who live nowhere near Santa Clara Valley because Apple includes Cupertino as a default location for the Weather app — but in an update acknowledges that the warning notification is being received by users who don’t have any saved locations near the heat wave. (I’ve gotten the notification on multiple devices, and don’t have Cupertino saved as a Weather location.)
What a weird bug.
The United States Attorney’s Office for the District of Columbia:
Haotian Sun, 34, and Pengfei Xue, 34, both Chinese nationals, were sentenced today for participating in a sophisticated scheme to defraud Apple Inc. out of millions of dollars’ worth of iPhones. U.S. District Court Judge Timothy J. Kelly sentenced Sun to 57 months in prison, and sentenced Xue to 54 months in prison. [...]
According to the government’s evidence, between May 2017 and September 2019, Sun, Xue, and other conspirators defrauded Apple Inc. by submitting counterfeit iPhones to Apple Inc. for repair to get Apple to exchange them with genuine replacement iPhones. Sun and Xue received shipments of inauthentic iPhones from Hong Kong at UPS mailboxes throughout the D.C. metropolitan area. They then submitted the fake iPhones, with spoofed serial numbers and/or IMEI numbers, to Apple retail stores and Apple Authorized Service Providers, including the Apple Store in Georgetown. Trial evidence and evidence developed after trial showed that members of the conspiracy submitted more than 6,000 inauthentic phones to Apple during the conspiracy, causing an intended loss of approximately $3.8 million and an actual loss of more than $2.5 million.
This seems like a scam you might expect to get away with a few times. Maybe more than a few, if you keep taking the counterfeit iPhones to different stores. But 6,000?
Novaya Gazeta Europe:
Apple removed nearly 60 additional virtual private network (VPN) apps from its Russia App Store between July and September, significantly more than the 25 acknowledged by the Russian authorities, according to a report published on Tuesday by the Apple Censorship Project, which campaigns for greater transparency from Apple over such moves.
According to researchers at GreatFire, an organisation which monitors online censorship in China, data indicates that Apple silently removed nearly 60 VPN services from the Russia App Store between 4 July and 18 September, bringing the total number of VPN apps now unavailable in the country to 98.
The report suggests that the scale of online censorship in Russia is much greater than was previously acknowledged when Roskomnadzor, Russia’s media regulator, announced in early July that it would be blocking 25 VPN apps in the Russian App Store, including some of the world’s most popular services such as NordVPN, ExpressVPN and Proton VPN.
The knee-jerk criticism of purges like this is to fault Apple for complying. But of course they have to comply. If Apple responded to this demand from the Russian government with "Nah, we're not going to comply", the Russian government would shut down the App Store in Russia. It's the same reason Apple can't just say "Nah" to complying with the DMA in the EU even though the company staunchly disagrees with the entirety of the DMA's requirements. The law's the law, whether the country is a brutal dictatorship or a liberal democracy.
The correct criticism of Apple is that this is the best argument against the App Store as the sole distribution channel of software for iOS. VPN software is still available for the Mac in Russia, and, I presume, is still available via sideloading for Android phones. When you create a choke point, you can be choked.
Christian Selig:
For those not aware, a few months ago after reaching out to me, YouTube contacted the App Store stating that Juno does not adhere to YouTube guidelines and modifies the website in a way they don’t approve of, and alludes to their trademarks and iconography.
I don’t personally agree with this, as Juno is just a web view, and acts as little more than a browser extension that modifies CSS to make the website and video player look more “visionOS” like. No logos are placed other than those already on the website, and the “for YouTube” suffix is permitted in their branding guidelines. Juno also doesn’t block ads in any capacity, for the curious.
I stated as much to YouTube, they wouldn’t really clarify or budge any, and as a result of both parties not being able to come to a conclusion I received an email a few minutes ago from Apple that Juno has been removed from the App Store.
This, to say the least, sucks. Juno is a wonderful VisionOS app — one of the very best third-party apps for the platform. It turns YouTube video watching from a totally meh experience inside Safari into a totally wow experience as a native app. It’s not like Juno was keeping people from using YouTube’s own native app because, famously, there isn’t one. A YouTube spokesperson told Nilay Patel at The Verge back in February that “a Vision Pro app is on our roadmap”, but as I wrote at the time, “given the design quality and adherence to platform design idioms of Google’s iOS apps (poor), I’m not sure they’re even capable of making a Juno-quality app.”
I still stand by that. I don’t expect to see YouTube launch a native VisionOS app soon, and even if they do, I doubt it’ll be anywhere near as good as Juno. What I was obviously wrong about in that February post was thinking that YouTube wouldn’t care about Juno’s existence, given that Juno did not block ads. All it did was make the YouTube experience great on Vision Pro.
This makes Selig — one of the most gifted indie developers working on Apple’s platforms today — 2 for 2 on getting hosed by big platforms for which Selig created exquisitely well-crafted clients. (The first, of course, was his beloved Reddit client Apollo.) If he goes 3 for 3, Phil Schiller should grant him a “trifecta” lifetime exemption from App Store commission fees.
The AP:
Technology reporter Taylor Lorenz said Tuesday that she is leaving The Washington Post, less than two months after the newspaper launched an internal review following her social media post about President Joe Biden.
Lorenz, a well-regarded expert on internet culture, wrote a book “Extremely Online” last year and said she is launching a newsletter, “User Mag,” on Substack.
Well-regarded by whom? Lorenz is a hack — a self-proclaimed social media expert done in by her own “private” Instagram post describing President Joe Biden as a “war criminal” that she subsequently lied about having posted. She didn’t “exit” the Post. She was obviously and rightfully fired.
This video from “MTT” warmed my heart. And that takes a lot. I learned Pascal on this keyboard. I absolutely loved this keyboard when I first encountered it. But, today, man, what a weird keyboard it is. I mean the arrow-key layout is one thing (up, down, left, right — arranged horizontally). But how about putting the backslash (\) key on the right of the space bar and the backtick (`) key on the left? I mean that’s just crazy. I recall absolutely loving the feel of this keyboard as a teenager but I’ve never bothered chasing one down in my adult life because I know today I could never bear the weird layout. But MTT didn’t just do the lazy thing (buy an ADB-USB adapter), he went the whole nine yards and designed and soldered his own custom parts to turn this 1986 gem into a modern day Bluetooth keyboard. Masterful.
Jason Snell returns to the show to discuss Apple’s September product announcements, and Meta’s Orion prototype AR glasses. Absolutely no baseball talk, almost.
Sponsored by:
Allen Pike:
As I understand it, my first experience in a self-driving car was typical:
- Minute 1: “How safe is this? Will it notice that cyclist? What about those construction cones?”
- Minute 10: “This is wild. It’s driving so calmly and safely. I love it.”
- Minute 20: (Bored, checking my email in the back seat.)
It was like a firmware update to my brain.
Imagine how exhilarating subways must have been a century ago — zipping across cities in high-speed underground trains. All technology becomes mundane quickly. It’s kind of amazing when you notice it happening with yourself.
Om Malik, after Apple’s September 9 “It’s Glowtime” event at Steve Jobs Theater:
I decided to become a fly on the wall and chronicle the spectacle unfolding in front of me. I focused on those who were there to create content about the devices, not the devices themselves. It was fun to just float among the crowds with my Nikon Zf and a 40mm lens.
It was a wonderful spectacle — just to bask in this new kind of raw media energy. Content for the sake of content. Events for the sake of content. Fog of content. It’s the new way of the world. As a student of media, I love this chaos and change — because from chaos and change comes the future.
I’m linking to this photo essay despite, not because of, the fact that it includes a portrait of yours truly dicking around on his phone in the small room where the media wait for post-event briefings. Steve Jobs Theater is a beautiful and unique space, but there are aspects of the space that are hard to capture in photos. Om’s collection here captures the feel of it.
I tried to return the favor by photographing the photographer.
See also: Om’s thoughts on the event and announcements.
The Cincinnati Enquirer:
Pete Rose, the Cincinnati native who became baseball’s all-time hits leader as well as one of the most divisive figures in the sport’s history, died Monday, according to a TMZ report, which was confirmed by his agent, Ryan Fiterman. He was 83.
After reaching the pinnacle of the sport he loved, Rose was banned from baseball in 1989 for gambling while manager of his hometown Reds. That came just four years after Rose had broken Ty Cobb’s hit record, a mark that still stands. He is MLB’s all-time hits leader with 4,256.
Even putting aside the betting scandal, Rose was, by all accounts, a rotten person — peculiar at best. But he was an astonishingly good and captivating baseball player, with a nickname for the ages: Charlie Hustle. He played with a maniacal intensity. When he drew a walk, he’d sprint to first base, because that’s the only way he knew how to traverse the bases: at full speed. He drew 1,566 walks in his career. I met him once, during his post-baseball career selling autographs at Las Vegas sports memorabilia shops. My favorite Rose play wasn’t a hit. It was this catch in game 6 of the 1980 World Series.
Simon Willison:
Audio Overview is a fun new feature of Google's NotebookLM which is getting a lot of attention right now. It generates a one-off custom podcast against content you provide, where two AI hosts start up a "deep dive" discussion about the collected content. These last around ten minutes and are very NPR, with an astonishingly convincing audio back-and-forth conversation.
Here’s an example podcast created by feeding in an earlier version of this article (prior to creating this example).
I listened to the whole 15-minute podcast this morning. It was, indeed, surprisingly effective. It remains somewhere in the uncanny valley, but not at all in a creepy way. Just more in a "this is a bit vapid and phony" way. I think that if you played this example podcast for a non-technical person who isn't informed at all about the current state of generative AI, they would assume for the first few minutes, without question, that this was a recorded podcast between two actual humans, and that they might actually learn a few things about generative AI. But given that the "conversation" is literally about creating artificial podcasts like this very example, I wonder how many would, by the end, suspect that they were in fact listening to an AI-generated podcast? It's quite meta — which the male voice on the podcast even says during the episode.
But ultimately the conversation has all the flavor of a bowl of unseasoned white rice. Give it a listen, though. It’s remarkable.
Update: Jiminy Christ, listen to this one, where the prompt was a document with nothing more than the words poop and fart repeated over and over.
My thanks to Method Financial for sponsoring last week at Daring Fireball. Method Financial’s authentication technology allows instant access to a consumer’s full liability portfolio using just personal information and consent, eliminating the need for usernames and passwords.
With just a few lines of code, Method’s APIs enable real-time, read-write, and frictionless access to all consumer liability data with integrated payment rails. Method leverages integrations with over 15,000 financial institutions to stream up-to-date, high-fidelity data from users’ accounts and to facilitate payment to them.
Method has helped 3 million consumers connect over 24 million liability accounts at companies like Aven, SoFi, Figure, and Happy Money, saving borrowers millions in interest and providing access to billions of dollars in personalized loans.
Jason Snell, Six Colors:
These are companies playing the same game, but in different ways. Who’s ahead? I would argue that it’s impossible to tell, because if Apple had a product like Orion we would never see it. We can argue about whether Apple’s compulsion to never, ever comment on unannounced products is beneficial or not, but it’s a Steve Jobs-created bit of Apple personality that is very unlikely to be countermanded any time soon.
Here’s how tenuous the Orion prototype is. Meta claims it would cost $10,000, but they haven’t said whether that would be the cost of goods or the retail price. But let’s give them the benefit of the doubt and say that the retail price would be just $10,000 if they brought this to market today. That’s expensive. But it’s not ridiculous. You can buy high-end workstation-class desktops that cost that much. A fully-specced 16-inch MacBook Pro costs $7,200.
But according to The Verge, these Orion prototypes only get 2 hours of battery life. And they’re too thick and chunky. You look weird, if not downright ugly, wearing them. So Meta not only needs to bring the price down by a factor of at least 3× (which would put it around the $3,500 price of Vision Pro, which most critics have positioned as too expensive), they also need to make the glasses smaller — more svelte — while increasing battery life significantly. Those two factors are in direct contradiction with each other. The only easy way to increase battery life is to put a bigger battery in the device, which makes the device itself thicker and heavier. (See this year’s iPhone 16 Pro.)
Orion by all accounts is a really compelling demo. But it’s also very clearly a prototype device only suitable for demos. Even at $10,000 retail it wouldn’t be compelling today. Yet somehow Meta wants us to believe they have “line of sight” to a compelling consumer product at a compelling price.
It’s exciting that they showed Orion publicly, but I don’t think it helped Meta in any way going forward. There’s a reason why Apple didn’t show off a prototype iPhone in 2004.
Every September, the whole extended family at Relay FM raises money for St. Jude Children’s Research Hospital, one of the most amazing institutions in the world. St. Jude is dedicated to curing childhood cancer and helping families affected by it. Since 2019 Relay has raised over $3 million, and their best-ever single month was just north of $775,000.
This year they’re already at $925,000, within earshot of a cool million, with three days to go in the month. Let’s make that happen.
Update, 30 September: And, boom, they hit it: $1,041,913.31 and still counting.
In the midst of recording last week’s episode of The Talk Show with Nilay Patel, I offhandedly mentioned the age-old trick of holding down the Shift key while minimizing a window (clicking the yellow button) to see the genie effect in slow motion. Nilay was like “Wait, what? That’s not working for me...” and we moved on.
What I’d forgotten is that Apple had removed this as default behavior a few years ago (I think in MacOS 10.14 Mojave), but you can restore the feature with this hidden preference, typed in Terminal:
defaults write com.apple.dock slow-motion-allowed -bool YES
Then restart the Dock:
killall Dock
Or, in a single command:
defaults write com.apple.dock slow-motion-allowed -bool YES; killall Dock
Or, if you prefer a proper app to a command-line invocation, Marcel Bresink’s excellent TinkerTool.
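And if you later decide you want the default behavior back, deleting the preference (rather than setting it to NO) returns the Dock to its out-of-the-box state — a sketch, using the standard `defaults delete` command:

```shell
# Remove the hidden preference entirely and restart the Dock
defaults delete com.apple.dock slow-motion-allowed; killall Dock
```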
Tom Pritchard, writing at Tom’s Guide:
We put the iPhone 16, iPhone 16 Plus, iPhone 16 Pro and the iPhone 16 Pro Max through the Tom’s Guide battery test, which involves surfing the web over 5G at 150 nits of screen brightness. The iPhone 16 Pro Max and iPhone 16 Plus have risen to the top with some incredibly impressive results — making our best phone battery life list in the process. Here’s how the new iPhone 16 models’ battery life stacks up against their iPhone 15 counterparts, and rival flagships.
Tom Dotan and Berber Jin, reporting late last night for The Wall Street Journal (News+):
Apple is no longer in talks to participate in an OpenAI funding round expected to raise as much as $6.5 billion, an 11th hour end to what would have been a rare investment by the iPhone maker in another major Silicon Valley company. Apple recently fell out of the talks to join the round, which is slated to close next week, according to a knowledgeable person.
I just observed the other day that the tumultuous (to say the least) leadership situation at OpenAI seems incongruous with Apple’s.
Also surely related, to some degree, is this report on OpenAI’s financials that dropped yesterday from Mike Isaac and Erin Griffith at The New York Times:
OpenAI’s monthly revenue hit $300 million in August, up 1,700 percent since the beginning of 2023, and the company expects about $3.7 billion in annual sales this year, according to financial documents reviewed by The New York Times. OpenAI estimates that its revenue will balloon to $11.6 billion next year.
But it expects to lose roughly $5 billion this year after paying for costs related to running its services and other expenses like employee salaries and office rent, according to an analysis by a financial professional who has also reviewed the documents. Those numbers do not include paying out equity-based compensation to employees, among several large expenses not fully explained in the documents.
OpenAI: We lose a little on every sale but we make it up in volume.
iA, which has been shipping a version of iA Writer for Android for 7 years:
By September, we thought we had honored our side of the new agreement. But on the very day we expected to get our access back, Google altered the deal.
We were told that read-only access to Google Drive would suit our writing app better than the desired read/write access. That’s right — read-only for a writing app.
When we pointed out that this was not what we had, or what our users wanted, Google seemed to alter the deal yet again. In order to get our users full access to their Google Drive on their devices, we now needed to pass a yearly CASA (Cloud Application Security Assessment) audit. This requires hiring a third-party vendor like KPMG.
The cost, including all internal hours, amounts to about one to two months of revenue that we would have to pay to one of Google’s corporate amigos. An indie company handing over a month’s worth of revenue to a “Big Four” firm like KPMG for a pretty much meaningless scan. And, of course, this would be a recurring annual expense. More cash for Google’s partners, while small developers like us foot the bill for Android’s deeply ingrained security shortcomings.
Developing serious productivity apps for Android sounds like fun. (See also the footnote on how stunningly rampant piracy is on Android, too.)
Elizabeth Lopatto, reporting for The Verge:
X is preventing users from posting links to a newsletter containing a hacked document that’s alleged to be the Trump campaign’s research into vice presidential candidate JD Vance. The journalist who wrote the newsletter, Ken Klippenstein, has been suspended from the platform. Searches for posts containing a link to the newsletter turn up nothing.
Posting this just in case there remained an iota of a thought in your head that Elon Musk is actually a radical “free speech” absolutist and not just someone who blew $44 billion buying Twitter to warp the entire platform in the direction of his own weird un-American political agenda.
I’ll link first to The Verge’s “Everything Announced at Meta Connect 2024” roundup because Meta still hasn’t posted today’s keynote address on YouTube; best I’ve found is this recording of the livestream, starting around the 43m:20s mark. I watched most of the keynote live and found it engaging. Just 45 minutes long — partly because it was information dense, and partly because Mark Zuckerberg hosted the entire thing himself. He seems very comfortable, confident, and natural lately. Nothing slows an on-stage keynote down more than a parade of VPs. There was clearly no political infighting at Meta for stage time in this keynote. The keynote was Zuck’s, and because of that, it was punchy and brisk.
In terms of actual products that will actually ship, Meta announced the $300 Quest 3S. That’s more than an entire order of magnitude lower-priced than Vision Pro. Vision Pro might be more than 10× more capable than Quest 3S, but I’m not sure it’s 10× better for just playing games and watching movies, which might be the only things people want to do with headsets at the moment. They also launched a 7,500-unit limited edition of their $430 actually-somewhat-popular Ray-Ban Wayfarer “smart” glasses made with translucent plastic, iMac-style. It’s been a while since someone made a “look at the insides” consumer device. That’s fun, and a little quirky, too.
The big reveal was Orion, a working prototype of see-through AR glasses. Meta themselves are describing them as a “dev kit”, but not only are they not available for purchase, they’re not available, period. They’re internal prototypes for Meta’s own developers, not outside developers. They do seem interesting, for a demo, and I’m hearing from our Dithering correspondent on the scene in Menlo Park that using them is genuinely compelling. There can be no argument that actual glasses are the form factor for AR.
The Verge’s Alex Heath opened his piece on Orion today with this line:
They look almost like a normal pair of glasses.
That’s stretching the meaning of “almost” to a breaking point. I’d say they look vaguely kinda-sorta like a pair of normal glasses. Both the frames (super chunky) and the lenses (thick, prismatic, at times glowing) are conspicuous. They look orthopedic, like glasses intended to aid people whose vision is so low they’re legally blind. It really is true that Meta’s Ray-Ban Wayfarers are nearly indistinguishable from just plain Wayfarers. Orion isn’t like that at all. If you went out in public with these — which you can’t, because they’re internal prototypes — everyone would notice that you’re wearing some sort of tech glasses, or perhaps think you walked out of a movie theater without returning the 3D goggles. But: you could wear them in public if you wanted to, and unlike going out in public wearing a VR headset, you’d just look like a nerd, not a jackass. They’re close to something. But how close to something that would actually matter, especially price-wise, is debatable. From Heath’s report:
As Meta’s executives retell it, the decision to shelve Orion mostly came down to the device’s astronomical cost to build, which is in the ballpark of $10,000 per unit. Most of that cost is due to how difficult and expensive it is to reliably manufacture the silicon carbide lenses. When it started designing Orion, Meta expected the material to become more commonly used across the industry and therefore cheaper, but that didn’t happen.
“You can’t imagine how horrible the yields are,” says Meta CTO Andrew Bosworth of the lenses. Instead, the company pivoted to making about 1,000 pairs of the Orion glasses for internal development and external demos.
Snap recently unveiled their new Spectacles, which they’re leasing, not selling, for $1,200 per year. Snap’s Spectacles are so chunky they make Orion look inconspicuous in comparison. But the race to bring AR glasses to market is clearly on.
See Also: Heath’s interview with Zuckerberg for Decoder.
Next-Day Addendum: I woke up this morning with the following competitive back-and-forth in my head:
It’s a lot of back-and-forth volleying, which is what makes the early years of a new device frontier exciting and highly uncertain. Big bold ideas get tried out, and most of them wind up as dead ends to abandon. Compare and contrast to where we’ve been with laptops for the last 20 years, or the pinnacle we appear to have reached in recent years with phones. ★
A swing-and-a-miss from MKBHD. Criticism of the app is on two separate levels, but they’re being conflated. Level 1: the app is not good. Level 2: a paid wallpaper app? — LOL, wallpapers are free on Reddit. That second form of criticism — that there shouldn’t even exist a paid wallpaper app — is annoying and depressing, and speaks to how many people do not view original art as worth paying for. But it also speaks to the breadth of Brownlee’s audience, which I think is more tech-focused than design-focused. Scott Smith, on Mastodon, observed:
It’s really interesting to compare the reaction from the indie iOS community of @Iconfactory’s Wallaroo to the mainstream tech community’s reaction to @mkbhd’s Panels. I know they are not the same by any means but it sheds light on how many people in mainstream tech circles are still flabbergasted at paying for artwork.
So there’s that, and it is what it is. To some extent that freeloading cheapskate perspective can be ignored. If one’s argument is that all wallpapers ought to be free, that’s not a valid starting point for criticism of a paid wallpaper app/service.
The problem is, Panels is not a good app:
- It crashed on me during first run on iPhone.
- The UI is big and bulbous, and while it looks almost the same on iOS and Android (which is probably why it's so crude), it looks native on neither platform. It looks and feels more like the interface to a game than an app. If anything, it looks and feels more Android-y than iOS-y, if only because "doesn't really look native anywhere" is more of an Android thing. If Brownlee is down with how this app looks and feels, it explains quite a bit (more) about how he's willing to spend large stretches of time daily-driving Android phones.
- Totally subjective, but I don't think the wallpapers themselves are good. I mean like none of them. They feel like user-generated content, not professional content curated by a trusted tastemaker like Brownlee.
- The app has a crummy privacy report card, including using your general location for tracking, and on iOS brings up the "Ask App Not to Track" dialog. It's even worse on Android. Not premium. (Panels doesn't ask for GPS location access, but it uses your IP address for tracking, which Apple classifies as "location". Apple ought to clarify that distinction in App Store privacy report cards — asking for GPS is not the same thing at all as IP-based geolocation — but it's a bad look for the app.)
- "SD" (1080p) wallpapers are free to download from some creators but require watching a minute or two of video ads. Not premium.
- Subscribing costs $50/year or $12/month ($144 a year!) — which are, to say the least, premium prices. (Wallaroo is a far better app with — subjectively — far better wallpapers and costs $20/year or $2/month.)
It’s entirely plausible for a premium wallpaper app to justify a price of $50/year. But Panels isn’t a premium app. Premium apps don’t ask to track you across apps. Premium apps don’t make you watch ads to get to their free content. Premium apps look and feel native to their platforms. Premium apps don’t have sketchy privacy report cards. As it stands, Panels is incongruous and incoherent. ★
One of the many memorable moments in Steve Jobs’s 2007 introduction of the original iPhone was this slide showing four of the then-leading smartphones on the market. Jobs explained:
Now, why do we need a revolutionary user interface? Here’s four smartphones, right? Motorola Q, the BlackBerry, Palm Treo, Nokia E62 — the usual suspects. And, what’s wrong with their user interfaces? Well, the problem with them is really sort of in the bottom 40 there. It’s this stuff right there. They all have these keyboards that are there whether or not you need them to be there. And they all have these control buttons that are fixed in plastic and are the same for every application. Well, every application wants a slightly different user interface, a slightly optimized set of buttons, just for it.
And what happens if you think of a great idea six months from now? You can’t run around and add a button to these things. They’re already shipped. So what do you do? It doesn’t work because the buttons and the controls can’t change. They can’t change for each application, and they can’t change down the road if you think of another great idea you want to add to this product.
Well, how do you solve this? Hmm. It turns out, we have solved it. We solved it in computers 20 years ago. We solved it with a bitmapped screen that could display anything we want. Put any user interface up. And a pointing device. We solved it with the mouse. We solved this problem. So how are we going to take this to a mobile device? What we’re going to do is get rid of all these buttons and just make a giant screen. A giant screen.
At the time, what seemed most radical was eschewing a hardware QWERTY keyboard and instead implementing a touchscreen keyboard in software. Steve Ballmer, then CEO of Microsoft, in the infamous clip in which he laughed uproariously after being asked for his reaction to seeing the iPhone: “500 dollars? Fully subsidized, with a plan? I said, that is the most expensive phone in the world, and it doesn’t appeal to business customers because it doesn’t have a keyboard, which makes it not a very good email machine.”
Apple didn’t get rid of all the buttons, of course. But the buttons they kept were all for the system, the device, not for any specific application: power, volume, a mute switch (that, oddly, was copied by almost no competitors), and the lone button on the front face: Home.1 That’s it.
When Apple’s competitors stopped laughing at the iPhone and started copying it, they got rid of their hardware keyboards — theretofore the primary signifier differentiating a “smartphone” from a regular phone — but they couldn’t bring themselves to eliminate the not one but two dedicated hardware buttons that, to their unimaginative minds, were inherent to making any cell phone a phone: the green “call” and red “hang up” buttons. Android phones had those red/green buttons. The BlackBerry Storm had them too. Every phone but the iPhone had them. Until they caught up and realized those buttons were obviated too.
The thinking might have been rooted in the very name of the devices. Of course all phones — dumb phones, BlackBerry-style hardware-keyboard phones, iPhone-style touchscreen phones — ought to have phone buttons. I suspect they pondered very deeply how Apple was bold enough to eschew a hardware keyboard for an all-touchscreen design, but that they thought Apple was just taking minimalism to its extreme by eschewing green/red hardware call buttons. No matter how many other things they do, they’re phones first — it’s right there in their name!
But the iPhone has never really been fundamentally a telephone. On the iPhone, the Phone was always just another app. A special app, no question. Default placement in the Dock at the bottom of the Home Screen. Special background privileges within an otherwise highly constrained OS where most apps effectively quit when you’d go back to the Home Screen. Incoming phone calls instantly took over the entire screen. Jobs spent a lot of time in that introduction demonstrating the Phone app — including Visual Voicemail, a genuine breakthrough feature that required AT&T/Cingular’s cooperation on the back end.2
But, still, the Phone part of iPhone was then and remains now just an app. If you compared an iPhone to an iPod Touch, there was nothing on the iPhone hardware that indicated it was any more of a phone than the iPod Touch, which not only wasn’t a phone but didn’t even offer cellular networking. No buttons, for sure. No stick-out antenna. No carrier logo on the device. Look at a modern iPhone and there’s really only one function whose purpose is clearly visible from a conspicuous hardware protrusion: the camera lenses. Five years ago, in the lede of my review of the iPhones 11, I wrote, “A few weeks ago on my podcast, speculating on the tentpole features for this year’s new iPhones, I said that ‘iCamera’ would be a far more apt name than ‘iPhone’.”
What more proof of the camera’s singular importance to the iPhone would one need than the ever-growing block of camera lenses on the back of each year’s new models, or the “Shot on iPhone” ad campaign — the longest-running (and still ongoing) campaign in Apple’s history? A dedicated hardware button?
The facile take is that Apple has run out of hardware ideas and now just adds a new button to the iPhone each year — Action button last year, Camera Control this year, maybe they’ll finally add those green/red phone call buttons next year. But that’s underestimating just how radical it is for Apple, in the iPhone’s 18th annual hardware iteration, to add a hardware button dedicated to a single application.
And I mean application there in the general sense, not just the app sense. By default, of course, pressing Camera Control launches the iOS Camera app,3 but while setting up any new iPhone 16, Apple’s own onboarding screen describes its purpose as launching “a camera app”, with a lowercase c. Any third-party app that adopts new APIs and guidelines can serve as the camera app that gets launched (and, once launched, controlled) by Camera Control. (I’ll default to writing about using the system Camera app, though.)
Apple seemingly doesn’t ever refer to Camera Control as a “button”, but it is a button. You can see it depress, and it clicks even when the device is powered off (unlike, say, the haptic Touch ID Home button on iPhones of yore and the long-in-the-tooth iPhone SE). But it isn’t only a button. You can think of it as two physical controls in one: a miniature haptic slider (like a trackpad with only one axis) and an actually-clicking button.
When the Camera app is not already in shoot mode (whether your iPhone is on the Lock Screen or if another app is active — or even if you’re doing something else inside the Camera app other than shooting, like, say, reviewing existing photos):
When the Camera app is active and ready to shoot:

- Click: take a photo.
- Light press: bring up the adjustment slider overlay.
- Light double-press: choose which setting the slider controls (Exposure, Depth, Zoom, Cameras, Style, Tone).
- Slide: adjust the chosen setting.
Just writing that all out makes it sound complicated, and it is a bit complex. (Here’s Apple’s own illustrated guide to using Camera Control.) Cameras are complex. But if you just mash it down, it takes a picture. Camera Control is like a microcosm of the Camera app itself. Just want to point and shoot? Easy. Want to fiddle with ƒ-stops and styles? There’s a thoughtful UI to do that. In the early years of iPhone, Apple’s Camera app was truly point-and-shoot simplistic. The shooting interface had just a few buttons: a shutter, a photo/video toggle, a control for the flash, and a toggle for switching to the front-facing camera. The original iPhone and iPhone 3G didn’t even support video, and the front-facing camera didn’t arrive until the iPhone 4. Those old iPhones had simple camera hardware, and the app reflected that simplicity.
Apple’s modern camera hardware has become remarkably sophisticated, and the Camera app has too. But if you just want to shoot what you see in the viewfinder, it’s as simple as ever. Pinch to zoom, tap to focus, press the shutter button to shoot. But so many other controls and options are there, readily available and intelligently presented for those who want them, easily ignored by those who don’t. Apple’s Camera app is one of the best — and best-designed — pieces of software the world has ever seen. It’s arguably the most-copied interface the world has ever seen, too. You’d be hard-pressed to find a single premium Android phone whose built-in camera app doesn’t look like Apple’s, usually right down to the yellow accent color for text labels.
After over a week using several iPhone 16 review units, my summary of Camera Control is that it takes a while to get used to — I feel like I’m still getting used to it — but it already feels like something I wouldn’t want to do without. It’s a great idea, and a bold one. As I emphasized above, only in the 18th hardware revision has Apple added a hardware control dedicated to a single application. I don’t expect Apple to do it again. I do expect Apple’s rivals to copy Camera Control shamelessly.
At first, though, I was frustrated by the physical placement of Camera Control. As a hobbyist photographer who has been shooting with dedicated cameras all the way back to the late 1990s, my right index finger expects a shutter button to be located near the top right corner. But the center of Camera Control is 2 inches (5 cm) from the corner. I’ll never stop wishing for it to be closer to the corner, but after a week I’ve grown acclimated to its actual placement. And I get it. I’m old enough that I shoot all of my videos and most of my photos in widescreen orientation. But social media today is dominated by tallscreen video. As Apple’s Piyush Pratik explained during last week’s keynote, Camera Control is designed to be used in both wide (landscape) and tall (portrait) orientations. Moving it more toward the corner, where my finger wants it to be, would make it better for shooting widescreen, but would make it downright precarious to hold the iPhone while shooting tall. I hate to admit it but I think Apple got the placement right. Shooting tallscreen is just way too popular. And, after just a week, my index finger is getting more and more accustomed to its placement. It might prove to be a bit of a reach for people with small hands, though.
I’ve also been a bit frustrated by using Camera Control to launch Camera while my iPhone is locked. With the default settings, when your iPhone is unlocked, or locked but with the screen awake, a single click of Camera Control takes you right to shooting mode in the Camera app. That sounds obvious, and it is. But, when your iPhone is locked and the screen is off, or in always-on mode, clicking Camera Control just wakes up the screen. You have to click it again, after the screen is awake, to jump to shooting mode. Apple’s thinking here is obvious: they want to prevent an accidental click of Camera Control while it’s in your pocket or purse from opening Camera. Unlike almost every other mode you can get into on an iPhone, when you’re in shooting mode in Camera, the device won’t go to sleep automatically after a minute or two of inactivity. The current default in iOS 18, in fact, is to auto-lock after just 30 seconds. (Settings → Display & Brightness → Auto-Lock.) In shooting mode, the Camera app will stay open for a long time before going to sleep. You don’t want that to happen inadvertently while your iPhone is in your pocket.
But what I’ve encountered over the last week are situations where my iPhone is in my pocket, and I see something fleeting I want to shoot. This happened repeatedly during a Weezer concert my wife and I attended last Friday. (Great show.) What I want is to click Camera Control while taking the iPhone out of my pocket, and have it ready to shoot by the time I have it in front of my eyes. That’s how the on/off button works on dedicated cameras like my Ricoh GR IIIx. But with an iPhone 16, more often than not, the single click of Camera Control while taking the iPhone out of my pocket has only awakened the screen, not put it into shooting mode. I need to click it again to get into shooting mode. With a fleeting moment, that’s enough to miss the shot you wanted to take. The whole point of this is being a quick-draw gunslinger.
Apple offers a more-protective option in Settings → Camera → Camera Control → Launch Camera to require a double click, rather than single click, to launch your specified camera app. As I write this, I wish that they also offered a less-protective option to always launch your camera app on a single click, even if the phone is locked and the screen is off. A sort of “I’ll take my chances with accidental clicks” option. It’s possible though that Apple tried this, and found that inadvertent clicks are just too common. But as it stands, there’s no great way to always jump into shooting mode as quickly as possible.
When the iPhone is locked and the screen is off, a double click of Camera Control will jump you into shooting mode. I started doing that over the weekend, and at first I thought it satisfied my desire. But the problem with that is that if the iPhone is locked but the screen is already awake, a double click on Camera Control will jump into Camera on the first click and snap a photo with the second. I’ve had to delete at least half a dozen blurry accidental shots because of that.
A gesture that would avoid accidental invocations is clicking-and-holding the Camera Control button. In theory Apple could offer that as a surefire way to launch Camera while taking your iPhone out of your pocket. But Apple has reserved the click-and-hold gesture for visual intelligence, a new Apple Intelligence feature announced last week. That’s the feature that will put the nail in the coffin of dedicated “AI” devices like Humane’s AI Pin and Rabbit’s R1. Visual intelligence isn’t yet available, even in the developer betas of iOS 18.1, but the click-and-hold gesture is already reserved for it.4
So where I’ve landed, at this writing, is trying to remember only to double-click Camera Control while taking my iPhone out of my pocket to shoot, and just sucking it up with the occasional blurry unwanted shot when I double-click Camera Control when the screen is already awake. The only other technique I can think to try is to remember to always wait until I see that the screen is awake before clicking Camera Control, tilting the phone if necessary to wake it, but that would seemingly defeat the purpose of getting into shooting mode as quickly as possible.
By default, if you light-press-and-hold on Camera Control, nearly all of the UI elements disappear from the viewfinder screen. The shooting mode picker (Cinematic, Video, Photo, Portrait, Spatial, etc.), the zoom buttons (0.5×, 1×, 2×, 5×), the front/rear camera toggle, the thumbnail of your most recent photo — all of that disappears from the screen, leaving it about as uncluttered as the original iPhone Camera interface. Think of it as a half-press while using Camera Control as a shutter button. Dedicated hardware cameras have, for many decades, offered two-stage shutter buttons that work similarly. With those dedicated cameras, you press halfway to lock in a focus distance and exposure; then you can move the camera to recompose the frame while keeping the focus distance and exposure locked, before pressing fully to capture the image. Apple has promised to bring this feature to the Camera app for all iPhone 16 models in a software update “later this year”. (It’s not there yet in iOS 18.1 beta 4.) Camera Control does not have quite the same precise feel as a true two-stage shutter button that physically clicks at two separate points of depression (two detents), but it might eventually, in future iPhone models.
One issue with Camera Control is that because it’s capacitive, it’s tricky for case makers. The obvious solution is to just put a cutout around it, letting the user’s finger touch the actual Camera Control button. Apple’s more elegant solution, on their own silicone and clear cases and the new glossy polycarbonate cases from their subsidiary Beats, is “a sapphire crystal, coupled to a conductive layer to communicate the finger movements to the Camera Control”. That doesn’t sound like something you’re going to see in cheap $20 cases. In my testing, both with Apple’s cases and Beats’s, it works fairly seamlessly. I do think you lose some of the feel from the haptic feedback on light presses, though. Ultimately, Camera Control makes it more true than ever before that the best way to use an iPhone is without a case.
One more thing on Camera Control. Of the features that are adjustable via Camera Control (again: Exposure, Depth (ƒ-stop), Zoom, Cameras, Style, Tone), “Cameras” is an easily overlooked standout. Zoom offers continuous decimal increments from 0.5× to 25.0×. That is to say, you can slide your finger to get zoom values like 1.7×, 3.3×, 17.4×, etc. I almost never want that. I want to stick to the precise true optical increments: 0.5×, 1×, 2×, and 5×. That’s what the “Cameras” setting mode offers. Think of it as Zoom, but only with those precise values. (Instead of “Cameras”, this setting could have been called “Lenses”, but that’s potentially confusing because 1× and 2× both come from the same physical lens; the difference is how the sensor data is treated.) In fact, I wish I could go into Settings and disable Zoom from the list of features available in Camera Control. If I ever really want a non-optical zoom level, I can use the existing on-screen interface options.
What’s obvious is that Camera Control clearly was conceived of, designed, and engineered by photography aficionados within Apple who are intimately familiar with how great dedicated cameras work and feel. It surely must have been championed, politically, by the same group. It’s really just rather astounding that there is now a hardware control dedicated to photography on all new iPhones — and a mechanically complex control at that.
As usual, I’ll leave it to other reviewers to do in-depth pixel-peeping comparisons of image quality, but suffice it to say, to my eyes, the iPhone 16 Pro (the review unit I’ve been daily driving this past week) camera seems as great as usual.
The big new photographic feature this year has nothing to do with lenses or sensors. It’s a next-generation Photographic Styles, and it’s effectively “RAW for the rest of us”. This has always been the edge of my personal photographic nerdery/enthusiasm. I care enough about photography to have purchased numerous thousand-ish dollar cameras (and lenses) over the decades, but shooting RAW has never stuck for me. I understand what it is, and why it is technically superior to shooting JPEG/HEIC, but it’s just too much work. RAW lets you achieve better results through manual development in post, but you have to develop in post because raw RAW images (sorry) look strikingly flat and unsaturated. For a while I tried shooting RAW + JPEG, where each image you take is stored both as a straight-off-the-sensor RAW file and a goes-through-the-camera-imaging-pipeline JPEG file, but it turned out I never ever went back and developed those RAW images. And relative to JPEG/HEIC (which, henceforth, I’m just going to call “shooting JPEG” for brevity, even though iPhones have defaulted to the more-efficient HEIC format since iOS 11 seven years ago), RAW images take up 10× (or more) storage space.
It’s just too much hassle. The increase in image quality I can eke out developing RAW just isn’t worth the effort it takes — for me. For many serious photographers, it is. Everyone has a line like that. Some people don’t do any editing at all. They never crop, never change exposure in post, never apply filters — they just point and shoot and they’re done. For me, that line is shooting RAW.
Apple first introduced Photographic Styles with the iPhones 13 three years ago, with four self-descriptive primary styles: Rich Contrast (my choice), Vibrant, Warm, and Cool. Each primary style offered customization. Find a style you like, set it as your default, and go about your merry way. But whatever style you chose was how your photos were “developed” by the iPhone hardware imaging pipeline. Apple’s “filters” have been non-destructive for years, but the first generation of Photographic Styles is baked into the HEIC files the camera writes to storage.
With the iPhone 16 lineup, this feature is now significantly more powerful, while remaining just as convenient and easy to use.5 Apple eliminated what used to be called “filters” and recreated the better ones (e.g. Vibrant and Dramatic) as styles. There are now 15 base styles to choose from, most of them self-descriptively named (Neutral, Gold, Rose Gold), some more poetically named (Cozy, Quiet, Ethereal). The default style is named Standard, and it processes images in a way that looks, well, iPhone-y. The two that have me enamored thus far are Natural and Stark B&W. Standard iPhone image processing has long looked, to many of our eyes, at least slightly over-processed. Too much noise reduction, too much smoothing. A little too punchy. Natural really does look more natural, in a good way, to my eyes. Stark B&W brings to mind classic high-contrast black-and-white films like Kodak Tri-X 400.
A key aspect of Photographic Styles now is that they’re non-destructive. You can change your mind about any of it in post. Set your default to Stark B&W and later on, editing in Photos, you can change your mind and go back to a full-color image using whichever other style you want. There’s a lot of complex image processing going on behind the scenes — both in the iPhone 16 hardware and iOS 18 software — to make this seem like no big deal at all. But because the new Photographic Styles are largely (or entirely?) based on the hardware imaging pipeline, iPhones 13–15 will continue to use the first-generation Photographic Styles, even after upgrading to iOS 18.
I’ve always felt a little guilty about the fact that I’m too lazy to shoot RAW. This next-generation Photographic Styles feature in the iPhone 16 lineup might assuage, I suspect, the remaining vestiges of that guilt.
Apple kindly supplied me with all four models in the iPhone 16 lineup for review: the 16 in ultramarine, 16 Plus in pink, 16 Pro in natural titanium, and 16 Pro Max in desert titanium. Ultramarine is my favorite color on any iPhone in memory. It’s a fun poppy blue, and quite vibrant. Pink is good too, with, to my (and my wife’s) eyes, a touch of purple to it. The colors are extra saturated on the camera bumps, which looks great. Natural titanium looks extremely similar, if not identical, to the natural titanium on last year’s iPhone 15 Pro. (Apple’s own Compare page makes it appear as though this year’s natural titanium is noticeably lighter than last year’s, but here’s a photo from me showing a natural 15 Pro Max and 16 Pro side-by-side.) Desert titanium is sort of more gold than tan, but there is some brown to it, without rendering it the least bit Zune-like.
In short, the regular iPhone 16 offers some colors that truly pop. The iPhone 16 Pro models remain, as with all previous “Pro” iPhone colorways, staid shades of gray. White-ish gray, gray gray, near-black gray, and now desert gray.
I always buy black, or the closest to black Apple offers, and this tweet I wrote back in 2009 remains true, so the only year I’ve ever had a “which color to buy?” personal dilemma was 2016 with the iPhones 7, which Apple offered in both a matte “black” and Vader-like glossy “jet black”.6 I still kind of can’t believe Apple offered two utterly different blacks in the same model year.
But “which model to buy?” is sometimes more of a dilemma for yours truly. In 2020 I bought a regular iPhone 12, not the 12 Pro, on the grounds that it weighed less and felt better in hand than the Pro models. Whatever the non-pro iPhone 12 lacked in photographic capabilities wouldn’t matter so much, I correctly guessed, while I remained mostly homebound during the COVID pandemic. But I was also tempted, sorely, by the 12 Mini, and in hindsight I really don’t remember why that’s not the model I bought that year.
It’s a good thing, and a sign of strength for Apple, when the regular iPhone models are extremely appealing even to power users. It seemed like an artificial restriction last year, for example, that only the 15 Pro model got the new Action button. The year prior, only the 14 Pro models got the Dynamic Island; the regular iPhone 14 models were stuck with a no-fun notch. If you’re fairly deep into the weeds regarding TSMC’s first-generation 3nm fabrication, it makes sense why only the iPhone 15 Pro models got a new chip (the A17 Pro — there was no regular A17) while the iPhone 15 models stayed on the year-old A16, but still, that was a bummer too. This year, the regular 16 and 16 Plus not only get the Action button, they get the new Camera Control too (which, as I opined above, would make more sense as a “pro” feature than the Action button did last year), and a new A18 chip fabricated with TSMC’s second-generation 3nm process.
For my own use I’ve preordered an iPhone 16 Pro. But for the first time since the aforementioned iPhone 12 in 2020, I was genuinely tempted by the regular iPhone 16. The biggest functional difference between the 16 and 16 Pro models is that only the 16 Pros have a third telephoto lens. Last year, the 15 Pro Max went to 5×, but the 15 Pro remained at 3×. This year, both the 16 Pro and 16 Pro Max have the 5× telephoto lens. I tend to think I seldom use the telephoto lens, but it turns out I used it a little more in the last year than I would have guessed. Using smart albums in Photos to sort images by camera and lens, it looks like out of 3,890 total photos I shot with my iPhone 15 Pro, the breakdown by camera lens went like this:
Camera | Optical Zoom | Photos | Percentage |
---|---|---|---|
Ultrawide | 0.5× | 338 | 9% |
Main | 1×/2× | 3,076 | 79% |
Telephoto | 3× | 476 | 12% |
And, eyeballing the photos in that telephoto lens smart album, for most of them, I could have used a little more reach. I don’t expect to use 5× more often than I used 3×, but I expect to get better shots when I do. But it’s also the case that a fair number of the photos in that telephoto smart album are shots I just don’t care about that much. I do use the telephoto lens, and I look forward to having a 5× one instead of 3×, but I could live without it entirely and not miss it too much. (I only have 8 videos shot using 3× from the last year. Longer focal lengths are not good for handheld video.)
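For the curious, the percentages in that table are just each lens’s share of the 3,890-photo total, rounded to the nearest whole number. A quick sketch, using the counts from my smart albums:

```python
# Sanity-check the lens-usage percentages from the smart-album counts above.
counts = {"Ultrawide (0.5x)": 338, "Main (1x/2x)": 3076, "Telephoto (3x)": 476}
total = sum(counts.values())  # 3,890 photos
for lens, n in counts.items():
    print(f"{lens}: {n} photos, {round(100 * n / total)}%")
# Ultrawide: 9%, Main: 79%, Telephoto: 12%
```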
Aesthetically, the two-lens arrangement on the back of the iPhones 16 and 16 Plus is far more pleasing than the three-lens triangle-in-a-square arrangement on the iPhones 16 Pro and 16 Pro Max.
For the last few years (the iPhone 13, 14, and 15 generations), the aesthetic difference in the back camera systems hasn’t been so striking, because Apple placed the non-pro iPhones’ two lenses in a diagonal arrangement inside a square block. The two lenses on the backs of the iPhones 11 and 12 were aligned on the same axis (vertical, when holding the phone in tallscreen orientation), but they were still inside a raised square. You’d have to go back to 2018’s iPhone XS to find a two-lens iPhone with the iPhone 16’s pleasing pill-shaped bump.
Either you care about such purely aesthetic concerns or you don’t. I care. Not enough to purchase an iPhone 16 instead of a 16 Pro, but it was a factor. The iPhone 16 and 16 Plus simply look more pleasing from the back and feel better in hand, especially caseless, than any iPhone since 2018.
Here’s the pricing for the entire iPhone 16 lineup:
Model | 128 GB | 256 GB | 512 GB | 1 TB |
---|---|---|---|---|
16 | $800 | $900 | $1,100 | — |
16 Plus | $900 | $1,000 | $1,200 | — |
16 Pro | $1,000 | $1,100 | $1,300 | $1,500 |
16 Pro Max | — | $1,200 | $1,400 | $1,600 |
But perhaps a better way to compare is by size class. Regular size:
Model | 128 GB | 256 GB | 512 GB | 1 TB |
---|---|---|---|---|
16 | $800 | $900 | $1,100 | — |
16 Pro | $1,000 | $1,100 | $1,300 | $1,500 |
And big-ass size:
Model | 128 GB | 256 GB | 512 GB | 1 TB |
---|---|---|---|---|
16 Plus | $900 | $1,000 | $1,200 | — |
16 Pro Max | — | $1,200 | $1,400 | $1,600 |
At both size classes, it’s a $200 delta to go from the regular model to its Pro equivalent. Looking at Apple’s excellent-as-always Compare page, here are the advantages/exclusive features that jump out to me for the 16 Pro models, other than the extra telephoto camera lens, roughly in the order in which I personally care:
I think it’s amazing that the iPhone Pro models are now able to shoot professional-caliber video. But I don’t shoot video professionally. And because I don’t, I can’t remember the last time I needed to transfer data from my iPhone via the USB-C port, so, while the Pro models offer a noticeable advantage in USB performance, I might never use it personally over the next year.
Another difference is that the 16 Pro models have slightly bigger displays than the regular 16 models. The 16 Pro and 16 Pro Max are 6.3 and 6.9 inches; the regular 16 and 16 Plus are 6.1 and 6.7. Whether that’s actually an advantage for the Pro models depends on whether you care that they’re also slightly taller and heavier devices in hand.
I omitted from the above comparison the one spec people care most about: battery life. Here is the sleeper spec where the Pro models earn their keep. Once again grouping like-vs.-like size classes, and including the 15 Pro models for year-over-year comparison:
Model | Video | Video (streamed) |
---|---|---|
15 Pro | 23 hours | 20 hours |
16 | 22 hours | 18 hours |
16 Pro | 27 hours | 22 hours |
15 Pro Max | 29 hours | 25 hours |
16 Plus | 27 hours | 24 hours |
16 Pro Max | 33 hours | 29 hours |
Those battery life numbers come from Apple, not my own testing (and Apple cites them as “up to” numbers). But those numbers suggest 20 percent longer battery life on the 16 Pro models compared to their size-class non-pro counterparts. Anecdotally, that feels true to me. I use a Shortcuts automation to turn on Low Power mode whenever my iPhone battery level drops below 35 percent. With my iPhone 15 Pro, that generally happens every night at some point. Over the last week using the iPhone 16 Pro as my primary iPhone, it hasn’t dropped that low most nights. To say the least, that’s not a rigorous test in any way, shape, or form. But Apple has no history of exaggerating battery life claims, especially relative comparisons between devices. I think it’s the real deal, and the 16 Pro and 16 Pro Max probably get 20 percent longer battery life than their corresponding 16 and 16 Plus counterparts, and between 10–15 percent over last year’s Pro models, in practical day-to-day use.
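To put rough numbers on that, here’s the arithmetic using Apple’s “up to” video-playback hours from the table above — a back-of-the-envelope check, not a benchmark:

```python
# Percentage gains computed from Apple's "up to" video-playback hours above.
def pct_gain(new_hours: int, old_hours: int) -> float:
    return round(100 * (new_hours - old_hours) / old_hours, 1)

print(pct_gain(27, 22))  # 16 Pro vs. 16: 22.7
print(pct_gain(33, 27))  # 16 Pro Max vs. 16 Plus: 22.2
print(pct_gain(27, 23))  # 16 Pro vs. 15 Pro: 17.4
print(pct_gain(33, 29))  # 16 Pro Max vs. 15 Pro Max: 13.8
```

So the spec-sheet deltas within each size class are actually a bit over 20 percent; my 10–15 percent year-over-year figure is about practical day-to-day use, not the spec sheet.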
That alone might be worth a big chunk of the $200 price difference to some people.
I spent the weekdays last week running iOS 18.0; on Friday afternoon, I upgraded my 16 Pro review unit to the developer beta of iOS 18.1 (beta 3 at the time, since upgraded to beta 4). I’m sure many, if not most, reviewers will review only what comes in the box, and what’s coming in the box this week will be iOS 18.0 without any Apple Intelligence features.
That stance is fair enough, but I don’t see it as a big deal to include my 18.1 experience in this review. iOS 18.1 feels pretty close to shipping. Apple has promised “October”, and my gut feeling, using it for the last five days on this review unit, is that it’s pretty solid. I suspect it might ship closer to early October than late October. But even if it doesn’t appear until Halloween, I don’t think it’s absurd or offensive that Apple is already using Apple Intelligence to market the iPhone 16 lineup. It’s a little awkward right now, but it’s not a sham. It’s vaporware until it actually ships, but it’s vaporware that anyone with a developer account can install right now.
Also, none of the Apple Intelligence features currently in iOS 18.1 are game-changing. The Clean Up feature in Photos is pretty good, and when it doesn’t produce good results, you can simply revert to the original. The AI-generated summaries of messages, notifications, and emails in Mail are at times apt, but at others not so much. I haven’t tried the Rewrite tool because I’m, let’s face it, pretty confident in my own writing ability. But, after my own final editing pass, I ran this entire review through the Proofread feature, and it correctly flagged seven mistakes I missed, and an eighth that I had marked, but had forgotten to fix. Most of its suggestions that I have chosen to ignore were, by the book, legitimate. (E.g., it suggested replacing the jargon-y lede with the standard spelling lead. It also flagged my stubborn capitalization of “MacOS”.) It took 1 minute, 45 seconds to complete the proofreading pass of the 7,200+ words in Apple Notes on the iPhone 16 Pro. (Subsequent to the original publication of this review, I tried the Rewrite function on the text of it, for shits and giggles, and the only way I can describe the results is that it gave up.)
New Siri definitely offers a cooler-looking visual interface. And the new Siri voices sound more natural. But it also feels like Siri is speaking too slowly, as though Siri hails from the Midwest or something. (Changing Siri’s speaking rate to 110 percent in Settings → Accessibility → Siri sounds much more natural to my ears, and feels like it matches old Siri’s speaking rate.) Type to Siri is definitely cool, but I don’t see why we couldn’t have had that feature since 2010. I have actually used the new “Product Knowledge” feature, where Siri draws upon knowledge from Apple’s own support documentation, while writing this review. It’s great. But maybe Apple’s support website should have had better search years ago?
These are all good features. But let’s say you never heard of LLMs or ChatGPT. And instead, at WWDC this year, without any overarching “Apple Intelligence” marketing umbrella, Apple had simply announced features like a new cool-looking Siri interface, typing rather than talking to Siri, being able to remove unwanted background objects from photos, a “proofreading” feature for the standard text system that extends and improves the years-old but (IMO) kinda lame grammar-checking feature on MacOS, and brings it to iOS too? Those would seem like totally normal features Apple might add this year. But not tentpole features. These Apple Intelligence features strike me as nothing more than the sort of nice little improvements Apple makes across its OSes every year.
Apple reiterated throughout last week’s “It’s Glowtime” keynote, and now in its advertising for the iPhone 16 lineup, that these are the first iPhones “built for Apple Intelligence from the ground up”. I’m not buying that. These are simply the second generation of iPhone models with enough RAM to run on-device LLMs. LLMs are breakthrough technology. But they’re breakthroughs at the implementation level. The technology is fascinating and important, but so are things like the Swift programming language. I spent the first half of my time testing the iPhone 16 Pro running iOS 18.0 and the second half running 18.1 with Apple Intelligence. A few things got a little nicer. That’s it.
I might be underselling how impossible the Clean Up feature would be without generative AI. I am very likely underselling how valuable the new writing tools might prove to people trying to write in a second language, or who simply aren’t capable of expressing themselves well in their first language. But like I said, they’re all good features. I just don’t see them as combining to form the collective tentpole that Apple is marketing “Apple Intelligence” as. I get it that from Apple’s perspective, engineering-wise, it’s like adding an entire platform to the existing OS. It’s a massive engineering effort and the on-device execution constraints are onerous. But from a user’s perspective, they’re just ... features. When’s the last year Apple has not added cool new features of this scope?
Apple’s just riding — and now, through the impressive might of its own advertising and marketing, contributing to — the AI hype wave, and I find that a little eye-roll inducing. It would have been cooler, in an understated breathe-on-your-fingernails-and-polish-them-on-your-shirt kind of way, if Apple had simply added these same new features across their OSes without the marketing emphasis being on the “Apple Intelligence” umbrella. If not for the AI hype wave the industry is currently caught in, this emphasis on which features are part of “Apple Intelligence” would seem as strange as Apple emphasizing, in advertisements, which apps are now built using SwiftUI.
If the iPhone 16 lineup was “built from the ground up” with a purpose in mind, it’s to serve as the best prosumer cameras ever made. Not to create cartoon images of a dog blowing out candles on a birthday cake. The new iPhones 16 are amazing devices. The non-pro iPhone 16 and 16 Plus arguably offer the best value-per-dollar of any iPhones Apple has ever made. This emphasis on Apple Intelligence distracts from that.
The problem isn’t that Apple is marketing Apple Intelligence a few weeks before it’s actually going to ship. It’s that few of these features are among the coolest or most interesting things about the new iPhone 16 lineup, and none are unique advantages that only Apple has the ability or inclination to offer.7 Every phone on the market will soon be able to generate impersonal saccharine passages of text and uncanny-valley images via LLMs. Only Apple has the talent and passion to create something as innovative and genuinely useful as Camera Control. ★
While I’m reminiscing, allow me to reiterate my belief that the icon on the iPhone Home button is the single greatest icon ever designed. In my 2017 review of the iPhone X, I wrote:
The fundamental premise of iOS Classic is that a running app gets the entire display, and the Home button is how you interact with the system to get out of the current app and into another. Before Touch ID, the Home button was even labeled with a generic empty “app” icon, an iconographic touch of brilliance. [...]
I find it hard to consider a world where that button was marked by an icon that looked like a house (the overwhelmingly common choice for a “home” icon) or printed with the word “HOME” (the way iPods had a “MENU” button). Early iPhone prototypes did, in fact, have a “MENU” label on the button.
I truly consider the iPhone Home button icon the single best icon ever. It perfectly represented anything and everything apps could be — it was iconic in every sense of the word. ↩︎︎
It’s almost unfathomable how much of a pain in the ass voicemail was before the iPhone. Rather than manage messages on screen, you placed a phone call to your carrier and interfaced with their system by punching number buttons. You had to deal with each message sequentially. “Press 1 to play, 2 to go to the next message, 7 to delete.” And you had to actually listen to the messages to know who they were from. It was horrible. ↩︎︎
Unless, I suppose, you live in the EU and have exercised your hard-earned right to delete it. ↩︎︎
That’s the only way to launch visual intelligence, which means the feature is exclusive to the iPhone 16 lineup and won’t be available on iPhone 15 Pros. I’m truly looking forward to this feature, so that’s a bummer for iPhone 15 Pro owners. ↩︎︎
Here’s Apple’s brief documentation for the old Photographic Styles feature (iPhones 13, 14, 15) and the new version (iPhones 16). ↩︎︎
Jet black aluminum is back, and as Vader-esque as it was on the iPhone 7 in 2016, with a new colorway for the Apple Watch Series 10 this year. I have a review unit in jet black on my wrist and it’s so great. ↩︎︎
It’s fair to argue that Private Cloud Compute is uniquely Apple. Not that Apple is the only company that could build out such an infrastructure for guaranteed-private off-device AI processing, but among the few companies that could do it, Apple is the only one that cares so deeply about privacy that they would. I do not expect Private Cloud Compute to be replicated by Google, Samsung, Meta, Amazon, or Microsoft. Nor any of the AI startups like OpenAI or Anthropic. They simply don’t care enough to do it the hard way. Apple does. But that belongs in the marketing for Apple’s ongoing Privacy campaign, not for the iPhones 16 in particular. ↩︎︎