By John Gruber
WorkOS, the modern identity platform for B2B SaaS — free up to 1 million MAUs.
My thanks to WorkOS for, once again, sponsoring the week at Daring Fireball. WorkOS is a modern identity platform for B2B SaaS. Start selling to enterprise customers with just a few lines of code. Ship complex features like SSO and SCIM (pronounced skim) provisioning in minutes instead of months.
Today, some of the fastest growing startups are already powered by WorkOS, including Perplexity, Vercel, and Webflow.
For SaaS apps that care deeply about design and user experience, WorkOS is the perfect fit. From high-quality documentation to self-serve onboarding for your customers, it removes all the unnecessary complexity for your engineering team.
Another good overview of the Automattic/WP Engine saga, this one from Ari Levy at CNBC:
Mullenweg may be openly enthusiastic and grateful for the employees he still has on board, but the WordPress community is a mess. Many WP Engine customers are suffering, and Automattic is gearing up for a legal fight against a private equity firm with over $100 billion in assets.
Hard not to be reminded, somewhat, of the righteous indignation fueling Steve Jobs’s end of life crusade against Google for creating Android. Some big fundamental differences, of course. WordPress is GPL open source and iOS isn’t open at all. It’s the righteous fervor of the founder/leader of the company that’s reminiscent.
Emma Roth does the yeoman’s work of summarizing the complex and fast-moving legal feud between WordPress’s commercial arm and WP Engine, a major WordPress hosting provider:
Over the past several weeks, WordPress cofounder Matt Mullenweg has made one thing exceedingly clear: he’s in charge of WordPress’ future.
Mullenweg heads up WordPress.com and its parent company, Automattic. He owns the WordPress.org project, and he even leads the nonprofit foundation that controls the WordPress trademark. To the outside observer, these might appear to be independent organizations, all separately designed around the WordPress open-source project. But as he wages a battle against WP Engine, a third-party WordPress hosting service, Mullenweg has muddied the boundaries between three essential entities that lead a sprawling ecosystem powering almost half of the web.
To Mullenweg, that’s all fine — as long as it supports the health of WordPress long-term.
See also: Mullenweg’s “alignment” offer to Automattic’s nearly 1,900 employees.
Taegan Goddard, writing at Political Wire:
It’s worth recalling that a major reason Trump won in 2016 was that, just before the election, news broke about emails related to a closed investigation into Hillary Clinton’s emails being found on Anthony Weiner’s computer, the estranged husband of a top Clinton aide.
In the end, nothing came of this discovery, but the extensive news coverage of it almost certainly swayed the election. It was the top story in every major newspaper.
But this new evidence presented against Trump wasn’t even the lead story in the New York Times or Washington Post this morning. And it didn’t even make the front page of the Wall Street Journal or USA Today.
It’s true that millions of words have already been written about Trump’s attempt to overturn the 2020 election. But there was plenty of new information included in this filing which is directly relevant to the biggest news story this month.
This, I think, is entirely explained by the conventional wisdom that the U.S. news media is “liberal”, a decades-long work-the-refs strategy from Republicans. The truth is the news media is effectively in the tank for Trump, sanewashing his literal nonsense, outright lies, and violence-inspiring hate speech against even legal immigrants. But our major political news media remains so hyper-focused on appearing not to favor one political side over the other that it’s completely lost sight of what ought to be their north star: the truth. If the truth favors one party over the other, so be it. That’s the job of reporting the news.
The difference between how these same publications treated Hillary Clinton’s “but her emails” nonsense in 2016 compared to Jack Smith’s motion this week could not be more stark.
Update: If you prefer, imagine if a special counsel appointed by the Attorney General submitted a brief alleging any crimes at all committed by Kamala Harris. Let’s say personal tax evasion — crimes, but insignificant compared to multiple attempts to overthrow the results of the last presidential election. The major U.S. newspapers and cable channels would have covered nothing else in the days since. Yet for this brief laying out copious evidence that Trump attempted the worst crime imaginable against U.S. democracy itself, it’s relative crickets chirping and shoulder shrugs. Remember too that Trump is already a convicted felon. If Harris had been convicted of a felony this year, do you think it would be mentioned more frequently in news stories than it actually is for Trump? If you don’t, I have a bridge to sell you.
I missed this announcement from MLB a month ago:
Major League Baseball today announced a new multi-year international partnership with European workwear leader STRAUSS that makes the German company the Official Workwear Partner of MLB. The partnership marks STRAUSS’ first league-wide deal in the United States. STRAUSS entered the U.S. market in late 2023, and American brand awareness is the cornerstone of its marketing efforts. The new partnership also affords STRAUSS marketing rights with MLB across Canada, Mexico and Europe. As part of the deal, STRAUSS’ name and logo will adorn MLB batting helmets during the Postseason and regular season games in Europe, as well as MiLB batting helmets all season long, beginning in 2025.
But I couldn’t miss it watching postseason games on TV this week: there’s a ridiculous-looking “Strauss” on both sides of every player’s batting helmet now, as prominent as the team logo on the front. It looks even more desperate and obsequious on the helmets than it does printed in all-caps in MLB’s bootlicking press release. This is the sort of gimmick you expect from a struggling independent minor league team, not Major League Baseball.
They should’ve put the rights to these on-helmet ads up for public auction. I’d have chipped in for a fan-backed initiative to buy that on-helmet ad space to affix this slogan: “FIRE ROB MANFRED”.
Victoria Gomelsky, reporting with absurd credulity for The New York Times:
Hodinkee, the watch enthusiast website based in Manhattan that has helped spread the gospel of mechanical watchmaking since its founding in 2008, has a new owner.
On Friday, the Watches of Switzerland Group, one of the world’s largest watch retailers with more than 220 multibrand and brand stores in Britain and the United States, announced that it had acquired the media company, which includes a website, a magazine, a brand partnerships division and an insurance business. Neither company would disclose the terms of the deal. [...]
Both Mr. Clymer and Mr. Hurley said Hodinkee’s staff, which now totals about 35 people, would remain intact and that its editorial team would remain independent of Watches of Switzerland oversight.
“But at a point in time,” Mr. Hurley said, “when you click on the Hodinkee Shop, you will see the full range of the product that WatchesofSwitzerland.com carries. We are going to do some work over the next several months to make that effectively seamless.”
There is a name for a publication that is owned by a retailer: catalog. I’d love to be proven wrong and see Hodinkee return to excellence, but that seemed far more likely as an independent website than as a subsidiary of the world’s largest premium watch retailer. For years I read Hodinkee daily; for the last few years I largely stopped reading it at all. Here’s Clymer’s own column announcing the acquisition (“joining forces”) and his return to day-to-day leadership of the site.
An important follow-up to yesterday’s item about Russia demanding Apple remove VPN apps from the Russian App Store: you can use a VPN on iOS without an app. It just requires some futzing in Settings and a VPN provider that supports it. Presumably, this technique remains available to iPhone users in Russia. Here are instructions from one such VPN provider, ForestVPN:
- Access Settings: Open the Settings app on your iPhone, tap General, and scroll to VPN & Device Management.
- Add VPN Configuration: Select Add VPN Configuration and choose your desired protocol, such as L2TP or IKEv2.
- Enter VPN Details: Fill in the necessary fields like Description, Server, Remote ID, and Local ID. These details can be found on the ForestVPN website.
- Save and Connect: Tap Done to save your configuration, then enable the VPN by toggling the switch next to your newly created profile.
VPN apps remove complexity from this process, but it’s worth noting that VPN access doesn’t require an app.
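Worth noting, too, that these same fields can be delivered as a configuration profile (a .mobileconfig file), which is how VPN providers and device-management tools pre-fill manual VPN settings. Here’s a minimal sketch of an IKEv2 VPN payload; all the server names and identifiers are placeholders I made up for illustration, not any provider’s actual configuration:

```xml
<!-- Sketch of a VPN payload inside a .mobileconfig profile.
     All values below are illustrative placeholders. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.vpn.managed</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadIdentifier</key>
    <string>com.example.vpn.ikev2</string>
    <key>PayloadUUID</key>
    <string>REPLACE-WITH-A-UUID</string>
    <key>UserDefinedName</key>
    <string>Example VPN (IKEv2)</string>
    <key>VPNType</key>
    <string>IKEv2</string>
    <key>IKEv2</key>
    <dict>
        <key>RemoteAddress</key>
        <string>vpn.example.com</string>
        <key>RemoteIdentifier</key>
        <string>vpn.example.com</string>
        <key>LocalIdentifier</key>
        <string>user@example.com</string>
        <key>AuthenticationMethod</key>
        <string>None</string>
    </dict>
</dict>
```

Installing a profile like this creates the same toggle in Settings as entering the details by hand, with no app required.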
Chili Palmer, reporting for HighSpeedInternet:
Starlink announced on Oct. 2 it will offer one month of free internet in Hurricane Helene disaster areas. The free service will be available to new customers who order through the Starlink website and to customers who activate a kit they already have, whether it was donated or purchased at a retail store. Existing customers may also be eligible.
The announcement comes after more than 500 Starlink kits were distributed throughout the disaster area by private relief organizations.
It’s hard to overstate how differently Elon Musk would be perceived if he weren’t a whackjob on political and cultural issues.
Ryan Christoffel, writing for 9to5Mac:
Hurricane Helene has caused massive damage and taken over 100 lives across several US states. Many thousands of people are without power and/or cell service. But in the wake of the storm, reports have surfaced about a key iOS 18 feature that has been a lifeline for survivors: Messages via satellite.
Apple added Messages via satellite to millions of iPhones via its recent iOS 18 update. And now, according to reports on social media, it seems the feature arrived just in time. Here are a few tweets highlighting how useful the feature has proven.
It’s great that iOS 18 shipped before Helene hit, but a shame that it’s so new that most users haven’t yet upgraded. And once Helene hit and knocked out all comms in the most severely-hit areas, it was too late. (Apple hasn’t yet pushed iOS 18 to the majority of users whose devices are set to install updates automatically, and typically doesn’t do so with new iOS versions until the .1 release in October or November.) Some heads-up people were recommending that iPhone 14 and 15 users in Helene’s path update to iOS 18 before it hit, specifically to get this feature. But still: the feature is already making a huge difference.
Cool Hunting:
We love getting into the nerdy details of design innovations and the iPhone 16‘s new Camera Control button presented a perfect opportunity to dig in. For this first podcast of our new Design Tangents series aptly named Nerdy Details we sit down with Johnnie Manzari from the Apple Human Interface team and Rich Dinh, Senior Director of Product Design, to talk about cameras and photography through the lens of the new control on “the world’s most popular camera.”
You don’t often get to hear Apple employees speak about their work. When you do, it’s often largely about trying to get the feel right.
Zac Hall, 9to5Mac:
iPhone users are being notified about an excessive heat weather event through Apple’s Weather app on iPhone. While the weather event is happening in the Santa Clara Valley region of California, the alert says that the occurrence is happening in an area nearby regardless of where you live.
Hall had a good theory — that the warnings were being delivered to people who live nowhere near Santa Clara Valley because Apple includes Cupertino as a default location for the Weather app — but in an update acknowledges that the warning notification is being received by users who don’t have any saved locations near the heat wave. (I’ve gotten the notification on multiple devices, and don’t have Cupertino saved as a Weather location.)
What a weird bug.
The United States Attorney’s Office for the District of Columbia:
Haotian Sun, 34, and Pengfei Xue, 34, both Chinese nationals, were sentenced today for participating in a sophisticated scheme to defraud Apple Inc. out of millions of dollars’ worth of iPhones. U.S. District Court Judge Timothy J. Kelly sentenced Sun to 57 months in prison, and sentenced Xue to 54 months in prison. [...]
According to the government’s evidence, between May 2017 and September 2019, Sun, Xue, and other conspirators defrauded Apple Inc. by submitting counterfeit iPhones to Apple Inc. for repair to get Apple to exchange them with genuine replacement iPhones. Sun and Xue received shipments of inauthentic iPhones from Hong Kong at UPS mailboxes throughout the D.C. metropolitan area. They then submitted the fake iPhones, with spoofed serial numbers and/or IMEI numbers, to Apple retail stores and Apple Authorized Service Providers, including the Apple Store in Georgetown. Trial evidence and evidence developed after trial showed that members of the conspiracy submitted more than 6,000 inauthentic phones to Apple during the conspiracy, causing an intended loss of approximately $3.8 million and an actual loss of more than $2.5 million.
This seems like a scam you might expect to get away with a few times. Maybe more than a few, if you keep taking the counterfeit iPhones to different stores. But 6,000?
Novaya Gazeta Europe:
Apple removed nearly 60 additional virtual private network (VPN) apps from its Russia App Store between July and September, significantly more than the 25 acknowledged by the Russian authorities, according to a report published on Tuesday by the Apple Censorship Project, which campaigns for greater transparency from Apple over such moves.
According to researchers at GreatFire, an organisation which monitors online censorship in China, data indicates that Apple silently removed nearly 60 VPN services from the Russia App Store between 4 July and 18 September, bringing the total number of VPN apps now unavailable in the country to 98.
The report suggests that the scale of online censorship in Russia is much greater than was previously acknowledged when Roskomnadzor, Russia’s media regulator, announced in early July that it would be blocking 25 VPN apps in the Russian App Store, including some of the world’s most popular services such as NordVPN, ExpressVPN and Proton VPN.
The kneejerk criticism to purges like this is to fault Apple for complying. But of course they have to comply. If Apple responded to this demand from the Russian government with “Nah, we’re not going to comply”, the Russian government would shut down the App Store in Russia. It’s the same reason Apple can’t just say “Nah” to complying with the DMA in the EU even though the company staunchly disagrees with the entirety of the DMA’s requirements. The law’s the law, whether the country is a brutal dictatorship or a liberal democracy.
The correct criticism to target at Apple is that this is the best argument against the App Store as the sole distribution channel of software for iOS. VPN software is still available for the Mac in Russia, and, I presume, is still available via sideloading for Android phones. When you create a choke point, you can be choked.
Christian Selig:
For those not aware, a few months ago after reaching out to me, YouTube contacted the App Store stating that Juno does not adhere to YouTube guidelines and modifies the website in a way they don’t approve of, and alludes to their trademarks and iconography.
I don’t personally agree with this, as Juno is just a web view, and acts as little more than a browser extension that modifies CSS to make the website and video player look more “visionOS” like. No logos are placed other than those already on the website, and the “for YouTube” suffix is permitted in their branding guidelines. Juno also doesn’t block ads in any capacity, for the curious.
I stated as much to YouTube, they wouldn’t really clarify or budge any, and as a result of both parties not being able to come to a conclusion I received an email a few minutes ago from Apple that Juno has been removed from the App Store.
This, to say the least, sucks. Juno is a wonderful VisionOS app — one of the very best third-party apps for the platform. It turns YouTube video watching from a totally meh experience inside Safari into a totally wow experience as a native app. It’s not like Juno was keeping people from using YouTube’s own native app because, famously, there isn’t one. A YouTube spokesperson told Nilay Patel at The Verge back in February that “a Vision Pro app is on our roadmap”, but as I wrote at the time, “given the design quality and adherence to platform design idioms of Google’s iOS apps (poor), I’m not sure they’re even capable of making a Juno-quality app.”
I still stand by that. I don’t expect to see YouTube launch a native VisionOS app soon, and even if they do, I doubt it’ll be anywhere near as good as Juno. What I was obviously wrong about in that February post was thinking that YouTube wouldn’t care about Juno’s existence, given that Juno did not block ads. All it did was make the YouTube experience great on Vision Pro.
This makes Selig — one of the most gifted indie developers working on Apple’s platforms today — 2 for 2 on getting hosed by big platforms for which Selig created exquisitely well-crafted clients. (The first, of course, was his beloved Reddit client Apollo.) If he goes 3 for 3, Phil Schiller should grant him a “trifecta” lifetime exemption from App Store commission fees.
The AP:
Technology reporter Taylor Lorenz said Tuesday that she is leaving The Washington Post, less than two months after the newspaper launched an internal review following her social media post about President Joe Biden.
Lorenz, a well-regarded expert on internet culture, wrote a book “Extremely Online” last year and said she is launching a newsletter, “User Mag,” on Substack.
Well-regarded by whom? Lorenz is a hack — a self-proclaimed social media expert done in by her own “private” Instagram post describing President Joe Biden as a “war criminal” that she subsequently lied about having posted. She didn’t “exit” the Post. She was obviously and rightfully fired.
This video from “MTT” warmed my heart. And that takes a lot. I learned Pascal on this keyboard. I absolutely loved this keyboard when I first encountered it. But, today, man, what a weird keyboard it is. I mean the arrow-key layout is one thing (up, down, left, right — arranged horizontally). But how about putting the backslash (\) key on the right of the space bar and the backtick (`) key on the left? I mean that’s just crazy. I recall absolutely loving the feel of this keyboard as a teenager but I’ve never bothered chasing one down in my adult life because I know today I could never bear the weird layout. But MTT didn’t just do the lazy thing (buy an ADB-USB adapter), he went the whole nine yards and designed and soldered his own custom parts to turn this 1986 gem into a modern-day Bluetooth keyboard. Masterful.
Jason Snell returns to the show to discuss Apple’s September product announcements, and Meta’s Orion prototype AR glasses. Absolutely no baseball talk, almost.
Sponsored by:
Allen Pike:
As I understand it, my first experience in a self-driving car was typical:
- Minute 1: “How safe is this? Will it notice that cyclist? What about those construction cones?”
- Minute 10: “This is wild. It’s driving so calmly and safely. I love it.”
- Minute 20: (Bored, checking my email in the back seat.)
It was like a firmware update to my brain.
Imagine how exhilarating subways must have been a century ago — zipping across cities in high-speed underground trains. All technology becomes mundane quickly. It’s kind of amazing when you notice it happening with yourself.
Om Malik, after Apple’s September 9 “It’s Glowtime” event at Steve Jobs Theater:
I decided to become a fly on the wall and chronicle the spectacle unfolding in front of me. I focused on those who were there to create content about the devices, not the devices themselves. It was fun to just float among the crowds with my Nikon Zf and a 40mm lens.
It was a wonderful spectacle — just to bask in this new kind of raw media energy. Content for the sake of content. Events for the sake of content. Fog of content. It’s the new way of the world. As a student of media, I love this chaos and change — because from chaos and change comes the future.
I’m linking to this photo essay despite, not because of, the fact that it includes a portrait of yours truly dicking around on his phone in the small room where the media wait for post-event briefings. Steve Jobs Theater is a beautiful and unique space, but there are aspects of the space that are hard to capture in photos. Om’s collection here captures the feel of it.
I tried to return the favor by photographing the photographer.
See also: Om’s thoughts on the event and announcements.
The Cincinnati Enquirer:
Pete Rose, the Cincinnati native who became baseball’s all-time hits leader as well as one of the most divisive figures in the sport’s history, died Monday, according to a TMZ report, which was confirmed by his agent, Ryan Fiterman. He was 83.
After reaching the pinnacle of the sport he loved, Rose was banned from baseball in 1989 for gambling while manager of his hometown Reds. That came just four years after Rose had broken Ty Cobb’s hit record, a mark that still stands. He is MLB’s all-time hits leader with 4,256.
Even putting aside the betting scandal, Rose was, by all accounts, a rotten person — peculiar at best. But he was an astonishingly good and captivating baseball player, with a nickname for the ages: Charlie Hustle. He played with a maniacal intensity. When he drew a walk, he’d sprint to first base, because that’s the only way he knew how to traverse the bases: at full speed. He drew 1,566 walks in his career. I met him once, during his post-baseball career selling autographs at Las Vegas sports memorabilia shops. My favorite Rose play wasn’t a hit. It was this catch in game 6 of the 1980 World Series.
Simon Willison:
Audio Overview is a fun new feature of Google’s NotebookLM which is getting a lot of attention right now. It generates a one-off custom podcast against content you provide, where two AI hosts start up a “deep dive” discussion about the collected content. These last around ten minutes and are very podcast, with an astonishingly convincing audio back-and-forth conversation.
Here’s an example podcast created by feeding in an earlier version of this article (prior to creating this example).
I listened to the whole 15-minute podcast this morning. It was, indeed, surprisingly effective. It remains somewhere in the uncanny valley, but not at all in a creepy way. Just more in a “this is a bit vapid and phony” way. I think that if you played this example podcast for a non-technical person who isn’t informed at all about the current state of generative AI, they would assume for the first few minutes, without question, that this was a recorded podcast between two actual humans, and that they might actually learn a few things about generative AI. But given that the “conversation” is literally about creating artificial podcasts like this very example, I wonder how many would, by the end, suspect that they were in fact listening to an AI-generated podcast? It’s quite meta — which the male voice on the podcast even says during the episode.
But ultimately the conversation has all the flavor of a bowl of unseasoned white rice. Give it a listen, though. It’s remarkable.
Update: Jiminy Christ, listen to this one, where the prompt was a document with nothing more than the words poop and fart repeated over and over.
My thanks to Method Financial for sponsoring last week at Daring Fireball. Method Financial’s authentication technology allows instant access to a consumer’s full liability portfolio using just personal information and consent, eliminating the need for usernames and passwords.
With just a few lines of code, Method’s APIs enable real-time, read-write, and frictionless access to all consumer liability data with integrated payment rails. Method leverages integrations with over 15,000 financial institutions to stream up-to-date, high-fidelity data from users’ accounts and to facilitate payment to them.
Method has helped 3 million consumers connect over 24 million liability accounts at companies like Aven, SoFi, Figure, and Happy Money, saving borrowers millions in interest and providing access to billions of dollars in personalized loans.
Jason Snell, Six Colors:
These are companies playing the same game, but in different ways. Who’s ahead? I would argue that it’s impossible to tell, because if Apple had a product like Orion we would never see it. We can argue about whether Apple’s compulsion to never, ever comment on unannounced products is beneficial or not, but it’s a Steve Jobs-created bit of Apple personality that is very unlikely to be countermanded any time soon.
Here’s how tenuous the Orion prototype is. Meta claims it would cost $10,000, but they haven’t said whether that would be the cost of goods or the retail price. But let’s give them the benefit of the doubt and say that the retail price would be just $10,000 if they brought this to market today. That’s expensive. But it’s not ridiculous. You can buy high-end workstation-class desktops that cost that much. A fully-specced 16-inch MacBook Pro costs $7,200.
But according to The Verge, these Orion prototypes only get 2 hours of battery life. And they’re too thick and chunky. You look weird, if not downright ugly, wearing them. So Meta not only needs to bring the price down by a factor of at least 3× (which would put it around the $3,500 price of Vision Pro, which most critics have positioned as too expensive), they also need to make the glasses smaller — more svelte — while increasing battery life significantly. Those two factors are in direct contradiction with each other. The only easy way to increase battery life is to put a bigger battery in the device, which makes the device itself thicker and heavier. (See this year’s iPhone 16 Pro.)
Orion by all accounts is a really compelling demo. But it’s also very clearly a prototype device only suitable for demos. Even at $10,000 retail it wouldn’t be compelling today. Yet somehow Meta wants us to believe they have “line of sight” to a compelling consumer product at a compelling price.
It’s exciting that they showed Orion publicly, but I don’t think it helped Meta in any way going forward. There’s a reason why Apple didn’t show off a prototype iPhone in 2004.
Every September, the whole extended family at Relay FM raises money for St. Jude Children’s Research Hospital, one of the most amazing institutions in the world. St. Jude is dedicated to curing childhood cancer and helping families affected by it. Since 2019 Relay has raised over $3 million, and their best-ever single month was just north of $775,000.
This year they’re already at $925,000, within earshot of a cool million, with three days to go in the month. Let’s make that happen.
Update, 30 September: And, boom, they hit it: $1,041,913.31 and still counting.
In the midst of recording last week’s episode of The Talk Show with Nilay Patel, I offhandedly mentioned the age-old trick of holding down the Shift key while minimizing a window (clicking the yellow button) to see the genie effect in slow motion. Nilay was like “Wait, what? That’s not working for me...” and we moved on.
What I’d forgotten is that Apple had removed this as default behavior a few years ago (I think in MacOS 10.14 Mojave), but you can restore the feature with this hidden preference, typed in Terminal:
defaults write com.apple.dock slow-motion-allowed -bool YES
Then restart the Dock:
killall Dock
Or, in a single command:
defaults write com.apple.dock slow-motion-allowed -bool YES; killall Dock
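And to check whether the override is set, or to undo it later, the same defaults tool works in reverse; deleting the key restores the stock behavior:

```shell
# Show the current value (prints an error if the key was never set)
defaults read com.apple.dock slow-motion-allowed

# Remove the override, then restart the Dock to revert to the default behavior
defaults delete com.apple.dock slow-motion-allowed; killall Dock
```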
Or, if you prefer a proper app to a command-line invocation, Marcel Bresink’s excellent TinkerTool.
Tom Pritchard, writing at Tom’s Guide:
We put the iPhone 16, iPhone 16 Plus, iPhone 16 Pro and the iPhone 16 Pro Max through the Tom’s Guide battery test, which involves surfing the web over 5G at 150 nits of screen brightness. The iPhone 16 Pro Max and iPhone 16 Plus have risen to the top with some incredibly impressive results — making our best phone battery life list in the process. Here’s how the new iPhone 16 models’ battery life stacks up against their iPhone 15 counterparts, and rival flagships.
Tom Dotan and Berber Jin, reporting late last night for The Wall Street Journal (News+):
Apple is no longer in talks to participate in an OpenAI funding round expected to raise as much as $6.5 billion, an 11th hour end to what would have been a rare investment by the iPhone maker in another major Silicon Valley company. Apple recently fell out of the talks to join the round, which is slated to close next week, according to a knowledgeable person.
I just observed the other day that the tumultuous (to say the least) leadership situation at OpenAI seems incongruous with Apple’s.
Also surely related, to some degree, is this report on OpenAI’s financials that dropped yesterday from Mike Isaac and Erin Griffith at The New York Times:
OpenAI’s monthly revenue hit $300 million in August, up 1,700 percent since the beginning of 2023, and the company expects about $3.7 billion in annual sales this year, according to financial documents reviewed by The New York Times. OpenAI estimates that its revenue will balloon to $11.6 billion next year.
But it expects to lose roughly $5 billion this year after paying for costs related to running its services and other expenses like employee salaries and office rent, according to an analysis by a financial professional who has also reviewed the documents. Those numbers do not include paying out equity-based compensation to employees, among several large expenses not fully explained in the documents.
OpenAI: We lose a little on every sale but we make it up in volume.
iA, which has been shipping a version of iA Writer for Android for 7 years:
By September, we thought we had honored our side of the new agreement. But on the very day we expected to get our access back, Google altered the deal.
We were told that read-only access to Google Drive would suit our writing app better than the desired read/write access. That’s right — read-only for a writing app.
When we pointed out that this was not what we had, or what our users wanted, Google seemed to alter the deal yet again. In order to get our users full access to their Google Drive on their devices, we now needed to pass a yearly CASA (Cloud Application Security Assessment) audit. This requires hiring a third-party vendor like KPMG.
The cost, including all internal hours, amounts to about one to two months of revenue that we would have to pay to one of Google’s corporate amigos. An indie company handing over a month’s worth of revenue to a “Big Four” firm like KPMG for a pretty much meaningless scan. And, of course, this would be a recurring annual expense. More cash for Google’s partners, while small developers like us foot the bill for Android’s deeply ingrained security shortcomings.
Developing serious productivity apps for Android sounds like fun. (See also the footnote on how stunningly rampant piracy is on Android, too.)
Elizabeth Lopatto, reporting for The Verge:
X is preventing users from posting links to a newsletter containing a hacked document that’s alleged to be the Trump campaign’s research into vice presidential candidate JD Vance. The journalist who wrote the newsletter, Ken Klippenstein, has been suspended from the platform. Searches for posts containing a link to the newsletter turn up nothing.
Posting this just in case there remained an iota of a thought in your head that Elon Musk is actually a radical “free speech” absolutist and not just someone who blew $44 billion buying Twitter to warp the entire platform in the direction of his own weird un-American political agenda.
Nilay Patel returns to the show to consider the iPhones 16.
Deepa Seetharaman, Berber Jin, and Tom Dotan, reporting for The Wall Street Journal (News+):
OpenAI is planning to convert from a nonprofit organization to a for-profit company at the same time it is undergoing major personnel shifts including the abrupt resignation Wednesday of its chief technology officer, Mira Murati. Becoming a for-profit would be a seismic shift for OpenAI, which was founded in 2015 to develop AI technology “to benefit humanity as a whole, unconstrained by a need to generate financial return,” according to a statement it published when it launched.
I guess I wasn’t paying close enough attention, but I wrongly thought this whole debate over turning OpenAI into a for-profit corporation had been decided a year ago, during the brief saga when the then-board of directors fired Sam Altman for being profit-driven, and then the board itself dissolved and Altman was brought back.
Things started to change in late 2022 when it released ChatGPT, which became an instant hit and sparked global interest in the potential of generative artificial intelligence to reshape business and society. Guided by Chief Executive Sam Altman, OpenAI started releasing new products for consumers and corporate clients and hired a slew of sales, strategy and finance staffers. Employees, including some who had been there from the early days, started to complain that the company was prioritizing shipping products over its original mission to build safe AI systems.
Some left for other companies or launched their own, including rival AI startup Anthropic. The exodus has been particularly pronounced this year. Before Murati, OpenAI’s co-founder and former chief scientist Ilya Sutskever, co-founder and former top researcher John Schulman, and former top researcher Jan Leike all resigned since May. Co-founder and former president Greg Brockman recently took a leave of absence through the end of the year.
In addition to Murati, chief research officer Bob McGrew and head of post-training Barret Zoph also are leaving OpenAI, according to a post on X from Altman.
OpenAI has high-profile partnerships with both Microsoft and Apple, two companies with decades of extraordinarily stable executive leadership. But OpenAI itself seems to be in a state of constant executive disarray and turmoil. That’s a bit of a head-scratcher to me.
Rasmus Larsen, writing for FlatpanelsHD:
While reviewing LG’s latest high-end G4 OLED TV (review here), FlatpanelsHD discovered that it now shows full-screen screensaver ads. The ad appeared before the conventional screensaver kicks in, as shown below, and was localized to the region the TV was set to.
We saw an ad for LG Channels — the company’s free, ad-supported streaming service — but there can also be full-screen ads from external partners, as shown in the company’s own example below.
Death comes for us all.
Amazon, Google, and Roku have long built their respective TV monetization strategies around ads, and with LG and Samsung turning webOS and Tizen into digital billboards, the only refuge appears to be Apple TV 4K, which can be connected to any TV. You can then disconnect your TV from the internet.
I bought an LG OLED in 2020 that hasn’t been connected to the internet since a few days after we started using it. It’s a great TV.
Gerald Lynch, editor-in-chief:
Dig out your old iPod and fire up your ‘Songs to cry to’ playlist, I come bearing sad news. After more than 15 years covering everything Apple, it’s with a heavy heart I announce that we will no longer be publishing new content on iMore.
I want to kick off by thanking you all for your support over the many years and incarnations of the site. Whether you were a day-one early adopter in the ‘PhoneDifferent’ days, came on board with ‘The iPhone Blog,’ or recently started reading to find out what the hell Apple Vision Pro is, it’s been a privilege to serve you a daily slice of Apple pie.
So it goes. Nice remembrances from Rene Ritchie (now at YouTube) and Serenity Caldwell (now at Apple).
Just appended the following to my piece from yesterday on Meta’s Orion AR glasses prototype:
- Facebook ships VR headsets and a software platform with an emphasis so strong on “the metaverse” that they rename the company Meta.
- Apple announces, and then 7 months later ships, Vision Pro with a two-fold message in comparison: (a) the “metaverse” thing is so stupid they won’t even use the term; (b) overwhelmingly superior resolution and experiential quality. Consumer response, however, is underwhelming.
- Meta drops the “metaverse” thing and previews Orion, effectively declaring that they think VR headsets are the wrong thing to build to create the product that defines the next breakthrough step change in personal computing. AR glasses, not VR headsets, are the goal.
It’s a lot of back-and-forth volleying, which is what makes the early years of a new device frontier exciting and highly uncertain. Big bold ideas get tried out, and most of them wind up as dead ends to abandon. Compare and contrast to where we’ve been with laptops for the last 20 years, or the pinnacle we appear to have reached in recent years with phones.
I’ll link first to The Verge’s “Everything Announced at Meta Connect 2024” roundup because Meta still hasn’t posted today’s keynote address on YouTube; best I’ve found is this recording of the livestream, starting around the 43m:20s mark. I watched most of the keynote live and found it engaging. Just 45 minutes long — partly because it was information dense, and partly because Mark Zuckerberg hosted the entire thing himself. He seems very comfortable, confident, and natural lately. Nothing slows an on-stage keynote down more than a parade of VPs. There was clearly no political infighting at Meta for stage time in this keynote. The keynote was Zuck’s, and because of that, it was punchy and brisk.
In terms of actual products that will actually ship, Meta announced the $300 Quest 3S. That’s more than an entire order of magnitude lower-priced than Vision Pro. Vision Pro might be more than 10× more capable than Quest 3S, but I’m not sure it’s 10× better for just playing games and watching movies, which might be the only things people want to do with headsets at the moment. They also launched a 7,500-unit limited edition of their $430 actually-somewhat-popular Ray-Ban Wayfarer “smart” glasses made with translucent plastic, iMac-style. It’s been a while since someone made a “look at the insides” consumer device. That’s fun, and a little quirky, too.
The big reveal was Orion, a working prototype of see-through AR glasses. Meta themselves are describing them as a “dev kit”, but not only are they not available for purchase, they’re not available, period. They’re internal prototypes for Meta’s own developers, not outside developers. They do seem interesting, for a demo, and I’m hearing from our Dithering correspondent on the scene in Menlo Park that using them is genuinely compelling. There can be no argument that actual glasses are the form factor for AR.
The Verge’s Alex Heath opened his piece on Orion today with this line:
They look almost like a normal pair of glasses.
That’s stretching the meaning of “almost” to a breaking point. I’d say they look vaguely kinda-sorta like a pair of normal glasses. Both the frames (super chunky) and the lenses (thick, prismatic, at times glowing) are conspicuous. They look orthopedic, like glasses intended to aid people whose vision is so low they’re legally blind. It really is true that Meta’s Ray-Ban Wayfarers are nearly indistinguishable from just plain Wayfarers. Orion isn’t like that at all. If you went out in public with these — which you can’t, because they’re internal prototypes — everyone would notice that you’re wearing some sort of tech glasses, or perhaps think you walked out of a movie theater without returning the 3D goggles. But: you could wear them in public if you wanted to, and unlike going out in public wearing a VR headset, you’d just look like a nerd, not a jackass. They’re close to something. But how close to something that would actually matter, especially price-wise, is debatable. From Heath’s report:
As Meta’s executives retell it, the decision to shelve Orion mostly came down to the device’s astronomical cost to build, which is in the ballpark of $10,000 per unit. Most of that cost is due to how difficult and expensive it is to reliably manufacture the silicon carbide lenses. When it started designing Orion, Meta expected the material to become more commonly used across the industry and therefore cheaper, but that didn’t happen.
“You can’t imagine how horrible the yields are,” says Meta CTO Andrew Bosworth of the lenses. Instead, the company pivoted to making about 1,000 pairs of the Orion glasses for internal development and external demos.
Snap recently unveiled their new Spectacles, which they’re leasing, not selling, for $1,200 per year. Snap’s Spectacles are so chunky they make Orion look inconspicuous in comparison. But the race to bring AR glasses to market is clearly on.
See Also: Heath’s interview with Zuckerberg for Decoder.
Next-Day Addendum: I woke up this morning with the following competitive back-and-forth in my head:

- Facebook ships VR headsets and a software platform with an emphasis so strong on “the metaverse” that they rename the company Meta.
- Apple announces, and then 7 months later ships, Vision Pro with a two-fold message in comparison: (a) the “metaverse” thing is so stupid they won’t even use the term; (b) overwhelmingly superior resolution and experiential quality. Consumer response, however, is underwhelming.
- Meta drops the “metaverse” thing and previews Orion, effectively declaring that they think VR headsets are the wrong thing to build to create the product that defines the next breakthrough step change in personal computing. AR glasses, not VR headsets, are the goal.

It’s a lot of back-and-forth volleying, which is what makes the early years of a new device frontier exciting and highly uncertain. Big bold ideas get tried out, and most of them wind up as dead ends to abandon. Compare and contrast to where we’ve been with laptops for the last 20 years, or the pinnacle we appear to have reached in recent years with phones. ★
Reuters:
Masimo said on Wednesday founder Joe Kiani has decided to step down as the medical device maker’s CEO, days after shareholders voted to remove him from the company’s board following a bitter proxy battle with activist hedge fund Politan Capital Management.
The company named veteran healthcare executive, Michelle Brennan, as interim chief. Brennan was nominated by Politan for Masimo’s board last year, along with the hedge fund’s founder Quentin Koffey. Both were subsequently elected by shareholders.
Shares of the company were up 5.4% at $133 in early trade. The stock has fallen more than 40% since Feb. 15, 2022, when Masimo announced the $1-billion acquisition of audio products maker Sound United. The deal was a key factor behind Politan’s activism.
No mention of Apple in Reuters’s report, but I wouldn’t be surprised if there’s soon a settlement in the patent dispute over the blood oxygen sensors in recent Apple Watch models that’s left the feature disabled for new watches sold in the U.S. this year. My understanding is that Kiani was single-mindedly obsessed with fighting Apple over this.
Holds hand to earpiece... Correction: they’re bunking in a jail cell together in Brooklyn, not Gotham. We regret the error.
A swing-and-a-miss from MKBHD. Criticism of the app is on two separate levels, but they’re being conflated. Level 1: the app is not good. Level 2: a paid wallpaper app? — LOL, wallpapers are free on Reddit. That second form of criticism — that there shouldn’t even exist a paid wallpaper app — is annoying and depressing, and speaks to how many people do not view original art as worth paying for. But it also speaks to the breadth of Brownlee’s audience, which I think is more tech-focused than design-focused. Scott Smith, on Mastodon, observed:
It’s really interesting to compare the reaction from the indie iOS community of @Iconfactory’s Wallaroo to the mainstream tech community’s reaction to @mkbhd’s Panels. I know they are not the same by any means but it sheds light on how many people in mainstream tech circles are still flabbergasted at paying for artwork.
So there’s that, and it is what it is. To some extent that freeloading cheapskate perspective can be ignored. If one’s argument is that all wallpapers ought to be free, that’s not a valid starting point for criticism of a paid wallpaper app/service.
The problem is, Panels is not a good app:
- It crashed on me during first run on iPhone.
- The UI is big and bulbous, and while it looks almost the same on iOS and Android (which is probably why it’s so crude), it looks native on neither platform. It looks and feels more like the interface to a game than an app. If anything, it looks and feels more Android-y than iOS-y, if only because “doesn’t really look native anywhere” is more of an Android thing. If Brownlee is down with how this app looks and feels, it explains quite a bit (more) about how he’s willing to spend large stretches of time daily-driving Android phones.
- Totally subjective, but I don’t think the wallpapers themselves are good. I mean like none of them. They feel like user-generated content, not professional content curated by a trusted tastemaker like Brownlee.
- The app has a crummy privacy report card, including using your general location for tracking, and on iOS brings up the “Ask App Not to Track” dialog. It’s even worse on Android. Not premium. (Panels doesn’t ask for GPS location access, but it uses your IP address for tracking, which Apple classifies as “location”. Apple ought to clarify that distinction in App Store privacy report cards — asking for GPS is not the same thing at all as IP-based geolocation — but it’s a bad look for the app.)
- “SD” (1080p) wallpapers are free to download from some creators but require watching a minute or two of video ads. Not premium.
- Subscribing costs $50/year or $12/month ($144 a year!) — which are, to say the least, premium prices. (Wallaroo is a far better app with — subjectively — far better wallpapers and costs $20/year or $2/month.)
It’s entirely plausible for a premium wallpaper app to justify a price of $50/year. But Panels isn’t a premium app. Premium apps don’t ask to track you across apps. Premium apps don’t make you watch ads to get to their free content. Premium apps look and feel native to their platforms. Premium apps don’t have sketchy privacy report cards. As it stands, Panels is incongruous and incoherent. ★
Juli Clover, MacRumors:
With the iPhone 15 models that came out last year, Apple added an opt-in battery setting that limits maximum charge to 80 percent. The idea is that never charging the iPhone above 80 percent will increase battery longevity, so I kept my iPhone at that 80 percent limit from September 2023 to now, with no cheating.
My iPhone 15 Pro Max battery level is currently at 94 percent with 299 cycles. For a lot of 2024, my battery level stayed above 97 percent, but it started dropping more rapidly over the last couple of months. I left my iPhone at that 80 percent limit and at no point turned the setting off or tweaked it. [...] You can compare your battery level to mine, but here are a couple other metrics from MacRumors staff that also have an iPhone 15 Pro Max and did not have the battery level limited.
- Current capacity: 87%. Cycles: 329
- Current capacity: 90%. Cycles: 271
My year-old iPhone 15 Pro (not Max), which I simply used every day and charged to 100 percent overnight: max capacity: 89 percent, 344 charge cycles.
I’m so glad Clover ran this test for a year and reported her results, because it backs up my assumption: for most people there’s no practical point to limiting your iPhone’s charging capacity. All you’re doing is preventing yourself from ever enjoying a 100-percent-capacity battery. Let the device manage its own battery. Apple has put a lot of engineering into making that really smart.
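As a back-of-the-envelope check, here's the arithmetic on the numbers quoted above (just the reported figures, nothing more):

```python
# Rough arithmetic on the battery figures reported above.
# "limited" is Clover's iPhone 15 Pro Max, kept at the 80% charge
# limit for a year; "unlimited" is the two MacRumors staff units
# plus Gruber's 15 Pro, charged normally.

limited_capacity, limited_cycles = 94, 299
unlimited = [(87, 329), (90, 271), (89, 344)]  # (capacity %, cycles)

loss_limited = 100 - limited_capacity                      # 6 points in a year
avg_unlimited = sum(c for c, _ in unlimited) / len(unlimited)
loss_unlimited = 100 - avg_unlimited                       # ~11.3 points

print(round(loss_unlimited - loss_limited, 1))             # prints 5.3
```

In other words, a year of the 80 percent limit preserved roughly five extra points of long-term capacity, at the cost of forgoing 20 points of usable charge every single day. That's the trade-off Clover's test puts a number on.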
Donald Papp at Hackaday:
There’s a wild new feature making repair jobs easier (not to mention less messy) and iFixit covers it in their roundup of the iPhone 16’s repairability: electrically-released adhesive.
Here’s how it works. The adhesive looks like a curved strip with what appears to be a thin film of aluminum embedded into it. It’s applied much like any other adhesive strip: peel away the film, and press it between whatever two things it needs to stick. But to release it, that’s where the magic happens. One applies a voltage (a 9V battery will do the job) between the aluminum frame of the phone and a special tab on the battery. In about a minute the battery will come away with no force, and residue-free.
Clever.
Nilay Patel, writing at The Verge last week:
I asked Apple’s VP of camera software engineering Jon McCormack about Google’s view that the Pixel camera now captures “memories” instead of photos, and he told me that Apple has a strong point of view about what a photograph is — that it’s something that actually happened. It was a long and thoughtful answer, so I’m just going to print the whole thing:
Here’s our view of what a photograph is. The way we like to think of it is that it’s a personal celebration of something that really, actually happened.
Whether that’s a simple thing like a fancy cup of coffee that’s got some cool design on it, all the way through to my kid’s first steps, or my parents’ last breath, it’s something that really happened. It’s something that is a marker in my life, and it’s something that deserves to be celebrated.
And that is why when we think about evolving in the camera, we also rooted it very heavily in tradition. Photography is not a new thing. It’s been around for 198 years. People seem to like it. There’s a lot to learn from that. There’s a lot to rely on from that.
Think about stylization, the first example of stylization that we can find is Roger Fenton in 1854 — that’s 170 years ago. It’s a durable, long-term, lasting thing. We stand proudly on the shoulders of photographic history.
That’s a sharp and clear answer, but I’m curious how Apple contends with the relentless addition of AI editing to the iPhone’s competitors. The company is already taking small steps in that direction: a feature called “Clean Up” will arrive with Apple Intelligence, which will allow you to remove objects from photos like Google’s Magic Eraser.
McCormack’s response is genuinely thoughtful, and resonates deeply with my own personal take. But it’s worth noting that Apple is the conservative company when it comes to generative AI and photography — and yet they’re still shipping Clean Up. I’m not complaining about Clean Up’s existence. I’ve already used it personally. I’m just saying that even Apple’s stance involves significant use of generative AI.
Mark Wilson, writing at Fast Company:
Now, a full five years later, we are meeting the LoveFrom mascot, Montgomery: a bear, paying homage to San Francisco’s Montgomery Street where LoveFrom is headquartered and will soon open its own store.
Montgomery has just appeared on LoveFrom’s website, where it will sniff and follow your cursor, before slowly navigating over the letters of LoveFrom like rocks in a pond.
A lovely mark and even lovelier animation.
One of the many memorable moments in Steve Jobs’s 2007 introduction of the original iPhone was this slide showing four of the then-leading smartphones on the market. Jobs explained:
Now, why do we need a revolutionary user interface? Here’s four smartphones, right? Motorola Q, the BlackBerry, Palm Treo, Nokia E62 — the usual suspects. And, what’s wrong with their user interfaces? Well, the problem with them is really sort of in the bottom 40 there. It’s this stuff right there. They all have these keyboards that are there whether or not you need them to be there. And they all have these control buttons that are fixed in plastic and are the same for every application. Well, every application wants a slightly different user interface, a slightly optimized set of buttons, just for it.
And what happens if you think of a great idea six months from now? You can’t run around and add a button to these things. They’re already shipped. So what do you do? It doesn’t work because the buttons and the controls can’t change. They can’t change for each application, and they can’t change down the road if you think of another great idea you want to add to this product.
Well, how do you solve this? Hmm. It turns out, we have solved it. We solved it in computers 20 years ago. We solved it with a bitmapped screen that could display anything we want. Put any user interface up. And a pointing device. We solved it with the mouse. We solved this problem. So how are we going to take this to a mobile device? What we’re going to do is get rid of all these buttons and just make a giant screen. A giant screen.
At the time, what seemed most radical was eschewing a hardware QWERTY keyboard and instead implementing a touchscreen keyboard in software. Steve Ballmer, then CEO of Microsoft, in the infamous clip in which he laughed uproariously after being asked for his reaction to seeing the iPhone: “500 dollars? Fully subsidized, with a plan? I said, that is the most expensive phone in the world, and it doesn’t appeal to business customers because it doesn’t have a keyboard, which makes it not a very good email machine.”
Apple didn’t get rid of all the buttons, of course. But the buttons they kept were all for the system, the device, not for any specific application: power, volume, a mute switch (that, oddly, was copied by almost no competitors), and the lone button on the front face: Home.1 That’s it.
When Apple’s competitors stopped laughing at the iPhone and started copying it, they got rid of their hardware keyboards — theretofore the primary signifier differentiating a “smartphone” from a regular phone — but they couldn’t bring themselves to eliminate the not one but two dedicated hardware buttons that, to their unimaginative minds, were inherent to making any cell phone a phone: the green “call” and red “hang up” buttons. Android phones had those red/green buttons. The BlackBerry Storm had them too. Every phone but the iPhone had them. Until they caught up and realized those buttons were obviated too.
The thinking might have been rooted in the very name of the devices. Of course all phones — dumb phones, BlackBerry-style hardware-keyboard phones, iPhone-style touchscreen phones — ought to have phone buttons. I suspect they pondered very deeply how Apple was bold enough to eschew a hardware keyboard for an all-touchscreen design, but that they thought Apple was just taking minimalism to its extreme by eschewing green/red hardware call buttons. No matter how many other things they do, they’re phones first — it’s right there in their name!
But the iPhone has never really been fundamentally a telephone. On the iPhone, the Phone was always just another app. A special app, no question. Default placement in the Dock at the bottom of the Home Screen. Special background privileges within an otherwise highly constrained OS where most apps effectively quit when you’d go back to the Home Screen. Incoming phone calls instantly took over the entire screen. Jobs spent a lot of time in that introduction demonstrating the Phone app — including Visual Voicemail, a genuine breakthrough feature that required AT&T/Cingular’s cooperation on the back end.2
But, still, the Phone part of iPhone was then and remains now just an app. If you compared an iPhone to an iPod Touch, there was nothing on the iPhone hardware that indicated it was any more of a phone than the iPod Touch, which not only wasn’t a phone but didn’t even offer cellular networking. No buttons, for sure. No stick-out antenna. No carrier logo on the device. Look at a modern iPhone and there’s really only one function whose purpose is clearly visible from a conspicuous hardware protrusion: the camera lenses. Five years ago, in the lede of my review of the iPhones 11, I wrote, “A few weeks ago on my podcast, speculating on the tentpole features for this year’s new iPhones, I said that ‘iCamera’ would be a far more apt name than ‘iPhone’.”
What more proof of the camera’s singular importance to the iPhone would one need than the ever-growing block of camera lenses on the back of each year’s new models, or the “Shot on iPhone” ad campaign — the longest-running (and still ongoing) campaign in Apple’s history? A dedicated hardware button?
The facile take is that Apple has run out of hardware ideas and now just adds a new button to the iPhone each year — Action button last year, Camera Control this year, maybe they’ll finally add those green/red phone call buttons next year. But that’s underestimating just how radical it is for Apple, in the iPhone’s 18th annual hardware iteration, to add a hardware button dedicated to a single application.
And I mean application there in the general sense, not just the app sense. By default, of course, pressing Camera Control launches the iOS Camera app,3 but while setting up any new iPhone 16, Apple’s own onboarding screen describes its purpose as launching “a camera app”, with a lowercase c. Any third-party app that adopts new APIs and guidelines can serve as the camera app that gets launched (and, once launched, controlled) by Camera Control. (I’ll default to writing about using the system Camera app, though.)
Apple seemingly doesn’t ever refer to Camera Control as a “button”, but it is a button. You can see it depress, and it clicks even when the device is powered off (unlike, say, the haptic Touch ID Home button on iPhones of yore and the long-in-the-tooth iPhone SE). But it isn’t only a button. You can think of it as two physical controls in one: a miniature haptic slider (like a trackpad with only one axis) and an actually-clicking button.
When the Camera app is not already in shoot mode (whether your iPhone is on the Lock Screen or if another app is active — or even if you’re doing something else inside the Camera app other than shooting, like, say, reviewing existing photos):

- A single click launches your camera app and jumps you to shooting mode. (If the phone is locked with the screen off, that first click only wakes the screen; more on that below.)

When the Camera app is active and ready to shoot:

- A full click acts as the shutter and takes a photo.
- Light presses and slides along its surface work the haptic slider to adjust shooting controls.
- A light press-and-hold declutters the viewfinder UI; more on that below, too.
Just writing that all out makes it sound complicated, and it is a bit complex. (Here’s Apple’s own illustrated guide to using Camera Control.) Cameras are complex. But if you just mash it down, it takes a picture. Camera Control is like a microcosm of the Camera app itself. Just want to point and shoot? Easy. Want to fiddle with ƒ-stops and styles? There’s a thoughtful UI to do that. In the early years of iPhone, Apple’s Camera app was truly point-and-shoot simplistic. The shooting interface had just a few buttons: a shutter, a photo/video toggle, a control for the flash, and a toggle for switching to the front-facing camera. The original iPhone and iPhone 3G didn’t even support video, and the front-facing camera didn’t arrive until the iPhone 4. Those old iPhones had simple camera hardware, and the app reflected that simplicity.
Apple’s modern camera hardware has become remarkably sophisticated, and the Camera app has too. But if you just want to shoot what you see in the viewfinder, it’s as simple as ever. Pinch to zoom, tap to focus, press the shutter button to shoot. But so many other controls and options are there, readily available and intelligently presented for those who want them, easily ignored by those who don’t. Apple’s Camera app is one of the best — and best-designed — pieces of software the world has ever seen. It’s arguably the most-copied interface the world has ever seen, too. You’d be hard-pressed to find a single premium Android phone whose built-in camera app doesn’t look like Apple’s, usually right down to the yellow accent color for text labels.
After over a week using several iPhone 16 review units, my summary of Camera Control is that it takes a while to get used to — I feel like I’m still getting used to it — but it already feels like something I wouldn’t want to do without. It’s a great idea, and a bold one. As I emphasized above, only in the 18th hardware revision has Apple added a hardware control dedicated to a single application. I don’t expect Apple to do it again. I do expect Apple’s rivals to copy Camera Control shamelessly.
At first, though, I was frustrated by the physical placement of Camera Control. As a hobbyist photographer who has been shooting with dedicated cameras all the way back to the late 1990s, my right index finger expects a shutter button to be located near the top right corner. But the center of Camera Control is 2 inches (5 cm) from the corner. I’ll never stop wishing for it to be closer to the corner, but after a week I’ve grown acclimated to its actual placement. And I get it. I’m old enough that I shoot all of my videos and most of my photos in widescreen orientation. But social media today is dominated by tallscreen video. As Apple’s Piyush Pratik explained during last week’s keynote, Camera Control is designed to be used in both wide (landscape) and tall (portrait) orientations. Moving it more toward the corner, where my finger wants it to be, would make it better for shooting widescreen, but would make it downright precarious to hold the iPhone while shooting tall. I hate to admit it but I think Apple got the placement right. Shooting tallscreen is just way too popular. And, after just a week, my index finger is getting more and more accustomed to its placement. It might prove to be a bit of a reach for people with small hands, though.
I’ve also been a bit frustrated by using Camera Control to launch Camera while my iPhone is locked. With the default settings, when your iPhone is unlocked, or locked but with the screen awake, a single click of Camera Control takes you right to shooting mode in the Camera app. That sounds obvious, and it is. But, when your iPhone is locked and the screen is off, or in always-on mode, clicking Camera Control just wakes up the screen. You have to click it again, after the screen is awake, to jump to shooting mode. Apple’s thinking here is obvious: they want to prevent an accidental click of Camera Control while it’s in your pocket or purse from opening Camera. Unlike almost every other mode you can get into on an iPhone, when you’re in shooting mode in Camera, the device won’t go to sleep automatically after a minute or two of inactivity. The current default in iOS 18, in fact, is to auto-lock after just 30 seconds. (Settings → Display & Brightness → Auto-Lock.) In shooting mode, the Camera app will stay open for a long time before going to sleep. You don’t want that to happen inadvertently while your iPhone is in your pocket.
But what I’ve encountered over the last week are situations where my iPhone is in my pocket, and I see something fleeting I want to shoot. This happened repeatedly during a Weezer concert my wife and I attended last Friday. (Great show.) What I want is to click Camera Control while taking the iPhone out of my pocket, and have it ready to shoot by the time I have it in front of my eyes. That’s how the on/off button works on dedicated cameras like my Ricoh GR IIIx. But with an iPhone 16, more often than not, the single click of Camera Control while taking the iPhone out of my pocket has only awakened the screen, not put it into shooting mode. I need to click it again to get into shooting mode. With a fleeting moment, that’s enough to miss the shot you wanted to take. The whole point of this is being a quick-draw gunslinger.
Apple offers a more-protective option in Settings → Camera → Camera Control → Launch Camera to require a double click, rather than single click, to launch your specified camera app. As I write this, I wish that they also offered a less-protective option to always launch your camera app on a single click, even if the phone is locked and the screen is off. A sort of “I’ll take my chances with accidental clicks” option. It’s possible though that Apple tried this, and found that inadvertent clicks are just too common. But as it stands, there’s no great way to always jump into shooting mode as quickly as possible.
When the iPhone is locked and the screen is off, a double click of Camera Control will jump you into shooting mode. I started doing that over the weekend, and at first I thought it satisfied my desire. But the problem with that is that if the iPhone is locked but the screen is already awake, a double click on Camera Control will jump into Camera on the first click and snap a photo with the second. I’ve had to delete at least half a dozen blurry accidental shots because of that.
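The launch behavior described above amounts to a little state machine. Here's a toy model of it, purely my own reconstruction of the observed default behavior, not anything from Apple:

```python
def camera_control_presses(screen_awake, clicks):
    """Toy model of default Camera Control clicks, as described above.

    `screen_awake` is the phone's starting state; returns the list of
    effects produced by `clicks` successive presses.
    """
    shooting = False
    effects = []
    for _ in range(clicks):
        if not screen_awake:
            # Locked with the screen off: the first click only wakes the screen.
            screen_awake = True
            effects.append("wake screen")
        elif not shooting:
            # Screen awake, not in Camera: a click jumps to shooting mode.
            shooting = True
            effects.append("open camera")
        else:
            # Already in shooting mode: a click is the shutter.
            effects.append("snap photo")
    return effects

# Pocket scenario (screen off): a double click gets you shooting.
print(camera_control_presses(screen_awake=False, clicks=2))
# ['wake screen', 'open camera']

# But if the screen happens to be awake already, the same double click
# opens Camera on the first click and snaps a (probably blurry) photo
# with the second.
print(camera_control_presses(screen_awake=True, clicks=2))
# ['open camera', 'snap photo']
```

The problem, in state-machine terms, is that the same gesture lands in two different states depending on whether the screen is awake, and you can't tell which while the phone is still in your pocket.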
A gesture that would avoid accidental invocations is clicking-and-holding the Camera Control button. In theory Apple could offer that as a surefire way to launch Camera while taking your iPhone out of your pocket. But Apple has reserved the click-and-hold gesture for visual intelligence, a new Apple Intelligence feature announced last week. That’s the feature that will put the nail in the coffin of dedicated “AI” devices like Humane’s AI Pin and Rabbit’s R1. Visual intelligence isn’t yet available, even in the developer betas of iOS 18.1, but the click-and-hold gesture is already reserved for it.4
So where I’ve landed, at this writing, is trying to remember only to double-click Camera Control while taking my iPhone out of my pocket to shoot, and just sucking it up with the occasional blurry unwanted shot when I double-click Camera Control when the screen is already awake. The only other technique I can think to try is to remember to always wait until I see that the screen is awake before clicking Camera Control, tilting the phone if necessary to wake it, but that would seemingly defeat the purpose of getting into shooting mode as quickly as possible.
By default, if you light-press-and-hold on Camera Control, nearly all of the UI elements disappear from the viewfinder screen. The shooting mode picker (Cinematic, Video, Photo, Portrait, Spatial, etc.), the zoom buttons (0.5×, 1×, 2×, 5×), the front/rear camera toggle, the thumbnail of your most recent photo — all of that disappears from the screen, leaving it about as uncluttered as the original iPhone Camera interface. Think of it as a half-press while using Camera Control as a shutter button. Dedicated hardware cameras have, for many decades, offered two-stage shutter buttons that work similarly. With those dedicated cameras, you press halfway to lock in a focus distance and exposure; then you can move the camera to recompose the frame while keeping the focus distance and exposure locked, before pressing fully to capture the image. Apple has promised to bring this feature to the Camera app for all iPhone 16 models in a software update “later this year”. (It’s not there yet in iOS 18.1 beta 4.) Camera Control does not have quite the same precise feel as a true two-stage shutter button that physically clicks at two separate points of depression (two detents), but it might eventually, in future iPhone models.
One issue with Camera Control is that because it’s capacitive, it’s tricky for case makers. The obvious solution is to just put a cutout around it, letting the user’s finger touch the actual Camera Control button. Apple’s more elegant solution, on their own silicone and clear cases and the new glossy polycarbonate cases from their subsidiary Beats, is “a sapphire crystal, coupled to a conductive layer to communicate the finger movements to the Camera Control”. That doesn’t sound like something you’re going to see in cheap $20 cases. In my testing, both with Apple’s cases and Beats’s, it works fairly seamlessly. I do think you lose some of the feel from the haptic feedback on light presses, though. Ultimately, Camera Control makes it more true than ever before that the best way to use an iPhone is without a case.
One more thing on Camera Control. Of the features that are adjustable via Camera Control (again: Exposure, Depth (ƒ-stop), Zoom, Cameras, Style, Tone), “Cameras” is an easily overlooked standout. Zoom offers continuous decimal increments from 0.5× to 25.0×. That is to say, you can slide your finger to get zoom values like 1.7×, 3.3×, 17.4×, etc. I almost never want that. I want to stick to the precise true optical increments: 0.5×, 1×, 2×, and 5×. That’s what the “Cameras” setting mode offers. Think of it as Zoom, but only with those precise values. (Instead of “Cameras”, this setting could have been called “Lenses”, but that’s potentially confusing because 1× and 2× both come from the same physical lens; the difference is how the sensor data is treated.) In fact, I wish I could go into Settings and disable Zoom from the list of features available in Camera Control. If I ever really want a non-optical zoom level, I can use the existing on-screen interface options.
What’s obvious is that Camera Control clearly was conceived of, designed, and engineered by photography aficionados within Apple who are intimately familiar with how great dedicated cameras work and feel. It surely must have been championed, politically, by the same group. It’s rather astounding that there is now a hardware control dedicated to photography on all new iPhones — and a mechanically complex control at that.
As usual, I’ll leave it to other reviewers to do in-depth pixel-peeping comparisons of image quality, but suffice it to say, to my eyes, the iPhone 16 Pro (the review unit I’ve been daily driving this past week) camera seems as great as usual.
The big new photographic feature this year has nothing to do with lenses or sensors. It’s a next-generation Photographic Styles, and it’s effectively “RAW for the rest of us”. This has always been the edge of my personal photographic nerdery/enthusiasm. I care enough about photography to have purchased numerous thousand-ish dollar cameras (and lenses) over the decades, but shooting RAW has never stuck for me. I understand what it is, and why it is technically superior to shooting JPEG/HEIC, but it’s just too much work. RAW lets you achieve better results through manual development in post, but you have to develop in post because raw RAW images (sorry) look strikingly flat and unsaturated. For a while I tried shooting RAW + JPEG, where each image you take is stored both as a straight-off-the-sensor RAW file and a goes-through-the-camera-imaging-pipeline JPEG file, but it turned out I never ever went back and developed those RAW images. And relative to JPEG/HEIC (which, henceforth, I’m just going to call “shooting JPEG” for brevity, even though iPhones have defaulted to the more-efficient HEIC format since iOS 11 seven years ago), RAW images take up 10× (or more) storage space.
It’s just too much hassle. The increase in image quality I can eke out developing RAW just isn’t worth the effort it takes — for me. For many serious photographers, it is. Everyone has a line like that. Some people don’t do any editing at all. They never crop, never change exposure in post, never apply filters — they just point and shoot and they’re done. For me, that line is shooting RAW.
Apple first introduced Photographic Styles with the iPhones 13 three years ago, with four self-descriptive primary styles: Rich Contrast (my choice), Vibrant, Warm, and Cool. Each primary style offered customization. Find a style you like, set it as your default, and go about your merry way. But whatever style you chose was how your photos were “developed” by the iPhone hardware imaging pipeline. Apple’s “filters” have been non-destructive for years, but the first generation of Photographic Styles was baked into the HEIC files the iPhone wrote to storage.
With the iPhone 16 lineup, this feature is now significantly more powerful, while remaining just as convenient and easy to use.5 Apple eliminated what used to be called “filters” and recreated the better ones (e.g. Vibrant and Dramatic) as styles. There are now 15 base styles to choose from, most of them self-descriptively named (Neutral, Gold, Rose Gold), some more poetically named (Cozy, Quiet, Ethereal). The default style is named Standard, and it processes images in a way that looks, well, iPhone-y. The two that have me enamored thus far are Natural and Stark B&W. Standard iPhone image processing has long looked, to many of our eyes, at least slightly over-processed. Too much noise reduction, too much smoothing. A little too punchy. Natural really does look more natural, in a good way, to my eyes. Stark B&W brings to mind classic high-contrast black-and-white films like Kodak Tri-X 400.
A key aspect of Photographic Styles now is that they’re non-destructive. You can change your mind about any of it in post. Set your default to Stark B&W and later on, editing in Photos, you can change your mind and go back to a full-color image using whichever other style you want. There’s a lot of complex image processing going on behind the scenes — both in the iPhone 16 hardware and iOS 18 software — to make this seem like no big deal at all. But because the new Photographic Styles are largely (or entirely?) based on the hardware imaging pipeline, iPhones 13–15 will continue to use the first-generation Photographic Styles, even after upgrading to iOS 18.
I’ve always felt a little guilty about the fact that I’m too lazy to shoot RAW. This next-generation Photographic Styles feature in the iPhone 16 lineup might assuage, I suspect, the remaining vestiges of that guilt.
Apple kindly supplied me with all four models in the iPhone 16 lineup for review: the 16 in ultramarine, 16 Plus in pink, 16 Pro in natural titanium, and 16 Pro Max in desert titanium. Ultramarine is my favorite color on any iPhone in memory. It’s a fun poppy blue, and quite vibrant. Pink is good too, with, to my (and my wife’s) eyes, a touch of purple to it. The colors are extra saturated on the camera bumps, which looks great. Natural titanium looks extremely similar, if not identical, to the natural titanium on last year’s iPhone 15 Pro. (Apple’s own Compare page makes it appear as though this year’s natural titanium is noticeably lighter than last year’s, but here’s a photo from me showing a natural 15 Pro Max and 16 Pro side-by-side.) Desert titanium is more gold than tan, but there is some brown to it, without rendering it the least bit Zune-like.
In short, the regular iPhone 16 offers some colors that truly pop. The iPhone 16 Pro models remain, as with all previous “Pro” iPhone colorways, staid shades of gray. White-ish gray, gray gray, near-black gray, and now desert gray.
I always buy black, or the closest to black Apple offers, and this tweet I wrote back in 2009 remains true, so the only year I’ve ever had a “which color to buy?” personal dilemma was 2016 with the iPhones 7, which Apple offered in both a matte “black” and Vader-like glossy “jet black”.6 I still kind of can’t believe Apple offered two utterly different blacks in the same model year.
But “which model to buy?” is sometimes more of a dilemma for yours truly. In 2020 I bought a regular iPhone 12, not the 12 Pro, on the grounds that it weighed less and felt better in hand than the Pro models. Whatever the non-pro iPhone 12 lacked in photographic capabilities wouldn’t matter so much, I correctly guessed, while I remained mostly homebound during the COVID pandemic. But I was also tempted, sorely, by the 12 Mini, and in hindsight I really don’t remember why that’s not the model I bought that year.
It’s a good thing, and a sign of strength for Apple, when the regular iPhone models are extremely appealing even to power users. It seemed like an artificial restriction last year, for example, that only the 15 Pro models got the new Action button. The year prior, only the 14 Pro models got the Dynamic Island; the regular iPhone 14 models were stuck with a no-fun notch. If you’re fairly deep into the weeds regarding TSMC’s first-generation 3nm fabrication, you can see why only the iPhone 15 Pro models got a new chip (the A17 Pro — there was no regular A17) while the iPhone 15 models stayed on the year-old A16, but still, that was a bummer too. This year, the regular 16 and 16 Plus not only get the Action button, they get the new Camera Control too (which, as I opined above, would make more sense as a “pro” feature than the Action button did last year), and a new A18 chip fabricated with TSMC’s second-generation 3nm process.
For my own use I’ve preordered an iPhone 16 Pro. But for the first time since the aforementioned iPhone 12 in 2020, I was genuinely tempted by the regular iPhone 16. The biggest functional difference between the 16 and 16 Pro models is that only the 16 Pros have a third telephoto lens. Last year, the 15 Pro Max went to 5×, but the 15 Pro remained at 3×. This year, both the 16 Pro and 16 Pro Max have the 5× telephoto lens. I tend to think I seldom use the telephoto lens, but it turns out I used it a little more in the last year than I would have guessed. Using smart albums in Photos to sort images by camera and lens, it looks like out of 3,890 total photos I shot with my iPhone 15 Pro, the breakdown by camera lens went like this:
Camera | Optical Zoom | Photos | Percentage |
---|---|---|---|
Ultrawide | 0.5× | 338 | 9% |
Main | 1×/2× | 3,076 | 79% |
Telephoto | 3× | 476 | 12% |
And, eyeballing the photos in that telephoto lens smart album, for most of them, I could have used a little more reach. I don’t expect to use 5× more often than I used 3×, but I expect to get better shots when I do. But it’s also the case that a fair number of the photos in that telephoto smart album are shots I just don’t care about that much. I do use the telephoto lens, and I look forward to having a 5× one instead of 3×, but I could live without it entirely and not miss it too much. (I only have 8 videos shot using 3× from the last year. Longer lenses are not good focal lengths for handheld video.)
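For what it’s worth, the percentages in that table are just each lens’s share of the 3,890-photo total, rounded to the nearest whole number. A throwaway sketch, if you want to check the arithmetic (the counts are from my smart albums; the script itself is purely illustrative):

```python
# Per-lens shot counts from my iPhone 15 Pro smart albums.
counts = {
    "Ultrawide (0.5x)": 338,
    "Main (1x/2x)": 3076,
    "Telephoto (3x)": 476,
}
total = sum(counts.values())  # 3,890 photos in all

for lens, n in counts.items():
    # Each lens's share of all shots, rounded to a whole percentage.
    print(f"{lens}: {n} photos, {round(100 * n / total)}%")
```

The three rounded shares come out to 9, 79, and 12 percent, matching the table.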
Aesthetically, the two-lens arrangement on the back of the iPhones 16 and 16 Plus is far more pleasing than the three-lens triangle-in-a-square arrangement on the iPhones 16 Pro and 16 Pro Max.
For the last few years (the iPhone 13, 14, and 15 generations), the aesthetic difference in the back camera systems hasn’t been so striking, because Apple placed the non-pro iPhones’ two lenses in a diagonal arrangement inside a square block. The two lenses on the backs of the iPhones 11 and 12 were aligned on the same axis (vertical, when holding the phone in tallscreen orientation), but they were still inside a raised square. You’d have to go back to 2018’s iPhone XS to find a two-lens iPhone with the iPhone 16’s pleasing pill-shaped bump.
Either you care about such purely aesthetic concerns or you don’t. I care. Not enough to purchase an iPhone 16 instead of a 16 Pro, but it was a factor. The iPhone 16 and 16 Plus simply look more pleasing from the back and feel better in hand, especially caseless, than any iPhone since 2018.
Here’s the pricing for the entire iPhone 16 lineup:
Model | 128 GB | 256 GB | 512 GB | 1 TB |
---|---|---|---|---|
16 | $800 | $900 | $1,100 | — |
16 Plus | $900 | $1,000 | $1,200 | — |
16 Pro | $1,000 | $1,100 | $1,300 | $1,500 |
16 Pro Max | — | $1,200 | $1,400 | $1,600 |
But perhaps a better way to compare is by size class. Regular size:
Model | 128 GB | 256 GB | 512 GB | 1 TB |
---|---|---|---|---|
16 | $800 | $900 | $1,100 | — |
16 Pro | $1,000 | $1,100 | $1,300 | $1,500 |
And big-ass size:
Model | 128 GB | 256 GB | 512 GB | 1 TB |
---|---|---|---|---|
16 Plus | $900 | $1,000 | $1,200 | — |
16 Pro Max | — | $1,200 | $1,400 | $1,600 |
At both size classes, it’s a $200 delta to go from the regular model to its Pro equivalent. Looking at Apple’s excellent-as-always Compare page, here are the advantages/exclusive features that jump out to me for the 16 Pro models, other than the extra telephoto camera lens, roughly in the order in which I personally care:
I think it’s amazing that the iPhone Pro models are now able to shoot professional-caliber video. But I don’t shoot video professionally. And because I don’t, I can’t remember the last time I needed to transfer data from my iPhone via the USB-C port, so, while the Pro models offer a noticeable advantage in USB performance, I might never use it personally over the next year.
Another difference is that the 16 Pro models have slightly bigger displays than the regular 16 models. The 16 Pro and 16 Pro Max are 6.3 and 6.9 inches; the regular 16 and 16 Plus are 6.1 and 6.7. Whether that’s actually an advantage for the Pro models depends on whether you care that they’re also slightly taller and heavier devices in hand.
I omitted from the above comparison the one spec people care most about: battery life. Here is the sleeper spec where the Pro models earn their keep. Once again grouping like-vs.-like size classes, and including the 15 Pro models for year-over-year comparison:
Model | Video | Video (streamed) |
---|---|---|
15 Pro | 23 hours | 20 hours |
16 | 22 hours | 18 hours |
16 Pro | 27 hours | 22 hours |
15 Pro Max | 29 hours | 25 hours |
16 Plus | 27 hours | 24 hours |
16 Pro Max | 33 hours | 29 hours |
Those battery life numbers come from Apple, not my own testing (and Apple cites them as “up to” numbers). But those numbers suggest 20 percent longer battery life on the 16 Pro models compared to their size-class non-pro counterparts. Anecdotally, that feels true to me. I use a Shortcuts automation to turn on Low Power mode whenever my iPhone battery level drops below 35 percent. With my iPhone 15 Pro, that generally happens every night at some point. Over the last week using the iPhone 16 Pro as my primary iPhone, it hasn’t dropped that low most nights. To say the least, that’s not a rigorous test in any way, shape, or form. But Apple has no history of exaggerating battery life claims, especially relative comparisons between devices. I think it’s the real deal, and the 16 Pro and 16 Pro Max probably get 20 percent longer battery life than their corresponding 16 and 16 Plus counterparts, and between 10–15 percent over last year’s Pro models, in practical day-to-day use.
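If you want to check my arithmetic on Apple’s “up to” video-playback hours, here’s a quick sketch (the hours come from the table above; the comparison function is mine, not anything Apple publishes):

```python
# Apple's "up to" video playback hours for the iPhone 16 lineup.
video_hours = {"16": 22, "16 Plus": 27, "16 Pro": 27, "16 Pro Max": 33}

def pct_gain(better: str, base: str) -> int:
    """Rounded percentage improvement of one model's rated hours over another's."""
    return round(100 * (video_hours[better] / video_hours[base] - 1))

# Pro vs. regular, within each size class.
print(pct_gain("16 Pro", "16"))           # 27 vs. 22 hours
print(pct_gain("16 Pro Max", "16 Plus"))  # 33 vs. 27 hours
```

Both comparisons land a bit over 20 percent, which is where my “20 percent longer” figure comes from.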
That alone might be worth a big chunk of the $200 price difference to some people.
I spent the weekdays last week running iOS 18.0; on Friday afternoon, I upgraded my 16 Pro review unit to the developer beta of iOS 18.1 (beta 3 at the time, since upgraded to beta 4). I’m sure many, if not most, reviewers will review only what comes in the box, and what’s coming in the box this week will be iOS 18.0 without any Apple Intelligence features.
That stance is fair enough, but I don’t see it as a big deal to include my 18.1 experience in this review. iOS 18.1 feels pretty close to shipping. Apple has promised “October”, and my gut feeling, using it for the last five days on this review unit, is that it’s pretty solid. I suspect it might ship closer to early October than late October. But even if it doesn’t appear until Halloween, I don’t think it’s absurd or offensive that Apple is already using Apple Intelligence to market the iPhone 16 lineup. It’s a little awkward right now, but it’s not a sham. It’s vaporware until it actually ships, but it’s vaporware that anyone with a developer account can install right now.
Also, none of the Apple Intelligence features currently in iOS 18.1 are game-changing. The Clean Up feature in Photos is pretty good, and when it doesn’t produce good results, you can simply revert to the original. The AI-generated summaries of messages, notifications, and emails in Mail are at times apt, but at others not so much. I haven’t tried the Rewrite tool because I’m, let’s face it, pretty confident in my own writing ability. But, after my own final editing pass, I ran this entire review through the Proofread feature, and it correctly flagged seven mistakes I missed, and an eighth that I had marked, but had forgotten to fix. Most of its suggestions that I have chosen to ignore were, by the book, legitimate. (E.g., it suggested replacing the jargon-y lede with the standard spelling lead. It also flagged my stubborn capitalization of “MacOS”.) It took 1 minute, 45 seconds to complete the proofreading pass of the 7,200+ words in Apple Notes on the iPhone 16 Pro. (Subsequent to the original publication of this review, I tried the Rewrite function on the text of it, for shits and giggles, and the only way I can describe the results is that it gave up.)
New Siri definitely offers a cooler-looking visual interface. And the new Siri voices sound more natural. But it also feels like Siri is speaking too slowly, as though Siri hails from the Midwest or something. (Changing Siri’s speaking rate to 110 percent in Settings → Accessibility → Siri sounds much more natural to my ears, and feels like it matches old Siri’s speaking rate.) Type to Siri is definitely cool, but I don’t see why we couldn’t have had that feature since 2010. I have actually used the new “Product Knowledge” feature, where Siri draws upon knowledge from Apple’s own support documentation, while writing this review. It’s great. But maybe Apple’s support website should have had better search years ago?
These are all good features. But let’s say you’d never heard of LLMs or ChatGPT, and instead, at WWDC this year, without any overarching “Apple Intelligence” marketing umbrella, Apple had simply announced features like a cool new Siri interface, typing rather than talking to Siri, the ability to remove unwanted background objects from photos, and a “proofreading” feature for the standard text system that extends and improves the years-old but (IMO) kinda lame grammar-checking feature on MacOS and brings it to iOS too. Those would seem like totally normal features Apple might add this year, but not tentpole features. These Apple Intelligence features strike me as nothing more than the sort of nice little improvements Apple makes across its OSes every year.
Apple reiterated throughout last week’s “It’s Glowtime” keynote, and now in its advertising for the iPhone 16 lineup, that these are the first iPhones “built for Apple Intelligence from the ground up”. I’m not buying that. These are simply the second generation of iPhone models with enough RAM to run on-device LLMs. LLMs are breakthrough technology. But they’re breakthroughs at the implementation level. The technology is fascinating and important, but so are things like the Swift programming language. I spent the first half of my time testing the iPhone 16 Pro running iOS 18.0 and the second half running 18.1 with Apple Intelligence. A few things got a little nicer. That’s it.
I might be underselling how impossible the Clean Up feature would be without generative AI. I am very likely underselling how valuable the new writing tools might prove to people trying to write in a second language, or who simply aren’t capable of expressing themselves well in their first language. But like I said, they’re all good features. I just don’t see them as combining to form the collective tentpole that Apple is marketing “Apple Intelligence” as. I get it that from Apple’s perspective, engineering-wise, it’s like adding an entire platform to the existing OS. It’s a massive engineering effort and the on-device execution constraints are onerous. But from a user’s perspective, they’re just ... features. When’s the last year Apple did not add cool new features of this scope?
Apple’s just riding — and now, through the impressive might of its own advertising and marketing, contributing to — the AI hype wave, and I find that a little eye-roll inducing. It would have been cooler, in an understated breathe-on-your-fingernails-and-polish-them-on-your-shirt kind of way, if Apple had simply added these same new features across their OSes without the marketing emphasis being on the “Apple Intelligence” umbrella. If not for the AI hype wave the industry is currently caught in, this emphasis on which features are part of “Apple Intelligence” would seem as strange as Apple emphasizing, in advertisements, which apps are now built using SwiftUI.
If the iPhone 16 lineup was “built from the ground up” with a purpose in mind, it’s to serve as the best prosumer cameras ever made. Not to create cartoon images of a dog blowing out candles on a birthday cake. The new iPhones 16 are amazing devices. The non-pro iPhone 16 and 16 Plus arguably offer the best value-per-dollar of any iPhones Apple has ever made. This emphasis on Apple Intelligence distracts from that.
The problem isn’t that Apple is marketing Apple Intelligence a few weeks before it’s actually going to ship. It’s that few of these features are among the coolest or most interesting things about the new iPhone 16 lineup, and none are unique advantages that only Apple has the ability or inclination to offer.7 Every phone on the market will soon be able to generate impersonal saccharine passages of text and uncanny-valley images via LLMs. Only Apple has the talent and passion to create something as innovative and genuinely useful as Camera Control. ★
While I’m reminiscing, allow me to reiterate my belief that the icon on the iPhone Home button is the single greatest icon ever designed. In my 2017 review of the iPhone X, I wrote:
The fundamental premise of iOS Classic is that a running app gets the entire display, and the Home button is how you interact with the system to get out of the current app and into another. Before Touch ID, the Home button was even labeled with a generic empty “app” icon, an iconographic touch of brilliance. [...]
I find it hard to consider a world where that button was marked by an icon that looked like a house (the overwhelmingly common choice for a “home” icon) or printed with the word “HOME” (the way iPods had a “MENU” button). Early iPhone prototypes did, in fact, have a “MENU” label on the button.
I truly consider the iPhone Home button icon the single best icon ever. It perfectly represented anything and everything apps could be — it was iconic in every sense of the word. ↩︎︎
It’s almost unfathomable how much of a pain in the ass voicemail was before the iPhone. Rather than manage messages on screen, you placed a phone call to your carrier and interfaced with their system by punching number buttons. You had to deal with each message sequentially. “Press 1 to play, 2 to go to the next message, 7 to delete.” And you had to actually listen to the messages to know who they were from. It was horrible. ↩︎︎
Unless, I suppose, you live in the EU and have exercised your hard-earned right to delete it. ↩︎︎
That’s the only way to launch visual intelligence, which means the feature is exclusive to the iPhone 16 lineup and won’t be available on iPhone 15 Pros. I’m truly looking forward to this feature, so that’s a bummer for iPhone 15 Pro owners. ↩︎︎
Here’s Apple’s brief documentation for the old Photographic Styles feature (iPhones 13, 14, 15) and the new version (iPhones 16). ↩︎︎
Jet black aluminum is back, every bit as Vader-esque as it was on the iPhone 7 in 2016, in a new colorway for the Apple Watch Series 10 this year. I have a review unit in jet black on my wrist and it’s so great. ↩︎︎
It’s fair to argue that Private Cloud Compute is uniquely Apple. Not that Apple is the only company that could build out such an infrastructure for guaranteed-private off-device AI processing, but among the few companies that could do it, Apple is the only one that cares so deeply about privacy that they would. I do not expect Private Cloud Compute to be replicated by Google, Samsung, Meta, Amazon, or Microsoft. Nor any of the AI startups like OpenAI or Anthropic. They simply don’t care enough to do it the hard way. Apple does. But that belongs in the marketing for Apple’s ongoing Privacy campaign, not for the iPhones 16 in particular. ↩︎︎