23andMe Confirms Hackers Stole Ancestry Data on 6.9 Million Users 

Lorenzo Franceschi-Bicchierai, reporting for TechCrunch:

On Friday, genetic testing company 23andMe announced that hackers accessed the personal data of 0.1% of customers, or about 14,000 individuals. The company also said that by accessing those accounts, hackers were also able to access “a significant number of files containing profile information about other users’ ancestry.” But 23andMe would not say how many “other users” were impacted by the breach that the company initially disclosed in early October.

As it turns out, there were a lot of “other users” who were victims of this data breach: 6.9 million affected individuals in total.

In an email sent to TechCrunch late on Saturday, 23andMe spokesperson Katie Watson confirmed that hackers accessed the personal information of about 5.5 million people who opted-in to 23andMe’s DNA Relatives feature, which allows customers to automatically share some of their data with others. The stolen data included the person’s name, birth year, relationship labels, the percentage of DNA shared with relatives, ancestry reports and self-reported location.

Here’s a real shocker: 23andMe has updated their terms of service in an attempt to prevent a class action lawsuit. Good luck with that.

Apple Requires Only a Subpoena to Turn Over Push Notification Tokens to Law Enforcement; Google Requires a Court Order 

Drew Harwell, reporting for The Washington Post:

Apple said in a statement that “the federal government had prohibited us from sharing any information” about the requests and now that the method had become public, it was updating its upcoming transparency reports to “detail these kinds of requests.”

Apple’s Law Enforcement Guidelines, the company’s rules for how police and government investigators should seek user information, now note that a person’s Apple ID, associated with a push-notification token, can be “obtained with a subpoena or greater legal process.”

Neither Wyden nor Apple detailed how many notifications had been reviewed, who had been targeted, what crimes were being investigated or which governments had made the requests.

Law enforcement agents can issue subpoenas on their own, so there’s no oversight here. Google, on the other hand, requires a court order:

For U.S. requests of push notifications and other non-content information, Google said it requires a court order, not just a subpoena, that is subject to judicial oversight. With such orders, federal officials must persuade a judge that the requested data is relevant and material to an ongoing criminal probe.

Score one for Google here.

Senator Ron Wyden: Governments Are Spying on Apple and Google Users Through Push Notifications 

Raphael Satter, reporting for Reuters:

Unidentified governments are surveilling smartphone users via their apps’ push notifications, a U.S. senator warned on Wednesday. In a letter to the Department of Justice, Senator Ron Wyden said foreign officials were demanding the data from Alphabet’s Google and Apple. Although details were sparse, the letter lays out yet another path by which governments can track smartphones. [...]

In a statement, Apple said that Wyden’s letter gave them the opening they needed to share more details with the public about how governments monitored push notifications. “In this case, the federal government prohibited us from sharing any information,” the company said in a statement. “Now that this method has become public we are updating our transparency reporting to detail these kinds of requests.”

Google said that it shared Wyden’s “commitment to keeping users informed about these requests.”

From Wyden’s letter to Attorney General Merrick Garland:

Apple and Google should be permitted to be transparent about the legal demands they receive, particularly from foreign governments, just as the companies regularly notify users about other types of government demands for data. These companies should be permitted to generally reveal whether they have been compelled to facilitate this surveillance practice, to publish aggregate statistics about the number of demands they receive, and unless temporarily gagged by a court, to notify specific customers about demands for their data. I would ask that the DOJ repeal or modify any policies that impede this transparency.

See also: Joseph Cox, reporting at 404 Media: “Here’s a Warrant Showing the U.S. Government is Monitoring Push Notifications”.

The Standalone iTunes Movies and TV Shows Apps Are Discontinued in tvOS 17.2 

Benjamin Mayo, 9to5Mac:

As first reported in October, Apple will discontinue the standalone iTunes Movies and iTunes TV Shows apps on the Apple TV box, starting with tvOS 17.2. The warning message seen above has started appearing in the release candidate version of tvOS 17.2 beta, released yesterday.

Apple directs users to the TV app instead to manage their purchases, and buy and rent from the store. At least as far as Apple’s video content is concerned, the iTunes brand is on the way out.

Apple has updated the TV app in 17.2 in preparation of the migration away from the standalone iTunes videos app, bringing across some functionality that was previously missing in TV. That includes things like filtering by genre in purchased tab, and the inclusion of box sets in the store listings. The TV app also features a new sidebar design in this update, which includes a dedicated store and purchases tab for quick navigation.

It’s the updates to the TV app that make this possible. It’s a good simplification overall: Apple’s own content — both iTunes purchases and TV+ streaming content — is in the TV app.

Gurman Predicts Big March for Apple: New iPads Pro and Air, M3 MacBook Airs, and New iPad Peripherals 

Mark Gurman, reporting for Bloomberg:

The iPad Air, which is the company’s mid-tier tablet, currently comes with a 10.9-inch screen. For next year’s release, the company will add a version that’s about 12.9 inches, matching the size of what’s currently the biggest iPad Pro.

The company is also preparing revamped versions of the Apple Pencil and Magic Keyboard accessories, which it will sell alongside the new iPad Pro. The new Pencil — codenamed B532 — will represent the third generation of the product. The company released a new low-end model in November.

The new Magic Keyboards — codenamed R418 and R428 — will make the iPad Pro look more like a laptop and include a sturdier frame with aluminum.

A big iPad Air is interesting, and I suspect will prove popular. No word, alas, on a new iPad Mini though. (I wish Apple would drop the “Mini” brand and just make the iPad Air in three sizes: mini, regular, and large, with identical specs.)

Gurman offers no details about the form factor for the updated iPad Pro models. Given that last year’s 10th-generation regular iPad moved the front-facing camera to the long side of the device — the appropriate location for a camera when the iPad is being used laptop-style — it seems like a safe guess that Apple will do the same with these next-gen iPad Air and Pro models. But the spot where that camera would go is currently the same spot where current iPad Pros have the magnetic attachment for a 2nd-gen Apple Pencil. So I think that’s why Apple is going to introduce a 3rd-gen Pencil — they might need an altogether new way of pairing, charging, and attaching Pencils if they move the front-facing camera to the long side. (Well, that’s one reason to create a 3rd-gen Pencil. Other reasons, of course, would include various ways of making a better stylus — the current 2nd-gen Pencil is now over 5 years old.)

I’m also quite curious about the purported reimagined Magic Keyboards. The current ones are transformative for iPads, functionally, but the rubbery surface material just isn’t durable enough — especially the white ones. MacBooks are remarkably durable; iPad Magic Keyboards demand to be treated carefully. On mine, the rubber is peeling away around my most-used keys. That shouldn’t happen with any keyboard, but it definitely shouldn’t happen with one that costs $300-350.

Bloomberg: ‘Apple Set to Avoid EU Crackdown Over iMessage Service’ 

Samuel Stolton, reporting for Bloomberg:*

Apple Inc.’s iMessage service looks set to win a carve out from new European Union antitrust rules to rein in Big Tech platforms after watchdogs tentatively concluded that it isn’t popular enough with business users to warrant being hit by the regulation. [...]

In order to fall under the scope of the rules, a service must be deemed an “important gateway” for business users. EU enforcers now consider this is not the case for iMessage, according to the people.

If iMessage ended up being targeted by the Digital Markets Act, Apple would have faced potentially onerous obligations to make iMessage work with rival online messaging services, such as Meta Platforms Inc.’s WhatsApp or Facebook Messenger — a move that Apple has already strongly contested.

The elephant in the room with this particular issue is that the DMA’s interoperability demands between end-to-end encrypted (E2EE) messaging platforms make no technical sense whatsoever. It’s all just hand-waving on the part of the EU bureaucrats who are demanding it. They have no idea what E2EE really means. They just want to demand that a WhatsApp user should be able to send a message to someone on iMessage or Facebook Messenger. Just make it happen.

Who would run key exchange, and manage the discovery and distribution of said keys, for E2EE messages sent across platforms? Key exchange and discovery is essential, and a difficult problem to solve within each platform itself. I think it’s impossible across platforms. Within each platform, the platform owner is in charge and handles these things. With this EU fantasy of mandatory interop across messaging platforms, who would be in charge?

Apple getting exempted from this, I think, will mainly benefit Apple by letting them ignore an impossible mandate. I don’t think this interop will ever come to fruition, no matter what the EU demands, because I don’t think it can, nor do I think it should. Would be nice to just avoid the debate.

* You know.

Thieves Rob D.C. Uber Eats Driver, Steal Her Car, But Reject Android Phone 

Carl Willis, reporting for ABC 7News in Washington D.C.:

“As soon as he parked the car two masked gentlemen came up to him, armed,” she said. “They robbed him, took everything he had in his pockets, took the keys to my truck and got in and pulled off.”

She said one of them approached on foot in the 2400 block of 14th Street, NW. The other was in a black BMW, both of them armed with guns. She said the robbers were bold taking her husband’s phone, but then giving it back because it wasn’t to their liking.

“They basically looked at that phone and was like ‘Oh, that’s an Android? We don’t want this. I thought it was an iPhone,’” she said.

Leave the Android, take the cannoli.

Bending Spoons, the Parent Company That Now Owns — and Laid Off the Staff of — Filmic 

The Impassioned Moderate, a year ago:

News came out a few weeks ago that Bending Spoons, a consumer app studio, raised a massive $340 million round of financing. The press gushed about it: “Hollywood star, tech execs invest in Italian start-up Bending Spoons”, “Ryan Reynolds invests in ‘terrifying’ Italian start-up Bending”. And Ryan himself said things that are just so easy to imagine him saying (a testament to the spectacular job he’s done branding himself): “Their apps enable anyone to become a creative genius with minimum effort. In fact, their products terrify me so much, I had to invest.” (Ironically - or not? - his ad agency is called Maximum Effort…)

The problem? Bending Spoons is one of the most predatory actors on the entire App Store - they’re terrifying in a completely different way.

Bending Spoons’s business model is to buy successful apps, change them to a weekly auto-renewing subscription model that perhaps tricks users into signing up, and use the revenue to buy more apps and repeat the cycle. Filmic, for example, now defaults to a $3/week subscription — over $150/year. To be fair, there’s also a $40/year subscription.

It doesn’t seem like a scam, per se, but it doesn’t seem like a product-driven company. Apps seemingly don’t thrive after acquisition by Bending Spoons — instead, they get bled dry. There are some apps where a weekly subscription makes sense — Flighty comes to mind, for occasional travelers — but a camera app? Feels deceptive.

Bending Spoons is a big company with a lot of revenue and that spends a lot of money on App Store and Play Store search ads. (Here’s Tim Cook visiting their office last year.)

Kino: Forthcoming Video Camera App for iPhone From the Makers of Halide 

The timing is surely coincidental with regard to the news about Filmic, but, as they say, fortune favors the prepared.

Filmic’s Entire Staff Laid Off by Parent Company Bending Spoons 

Jaron Schneider, reporting for PetaPixel:

Filmic, or FiLMiC as written by the brand, no longer has any dedicated staff as parent company Bending Spoons has laid off the entire team including the company’s founder and CEO, PetaPixel has learned. Considered for years as the best video capture application for mobile devices, the team behind Filmic Pro and presumably Filmic Firstlight — the company’s photo-focused app — has been let go. [...]

It is unclear what Bending Spoons intends to do with Filmic Pro or Filmic Firstlight, but there were early signs of trouble when the company’s most recent major update was last year. The most recent notable update to Filmic Pro came in October which brought support for Apple Log into the app, but there was no mention of the addition of external SSD support, odd considering that Filmic Pro had a strong track record for updating its platform to work with all of the new iPhone updates — especially those that are particularly important for video.

In Filmic’s absence, Blackmagic Design’s iOS app has become the most popular way to capture footage with the new iPhones and was used by Apple’s in-house team for the production of its Mac event on October 31.

Christina Warren, on Threads:

Hate this but I’m sadly not at all surprised. Filmic has an incredible product they were afraid to charge for and when they finally changed pricing models, it was too little too late and users rebelled. If they had been charging $100 a year or even upfront in 2015, I think they could have survived without selling to the Bending Spoons vultures. But now they’ve got a subscription app that isn’t actively improving and free competition from Black Magic who uses their apps as loss leaders. Hate it.

Filmic was featured by Apple in numerous iPhone keynotes and App Store promotions over the years — for a long stretch it was undeniably the premier “pro” video camera app for iPhones.

India Is Considering EU-Style Charger Rules That Would Block Older iPhones From Sale 

Aditya Kalra and Munsif Vengattil, reporting for Reuters from New Delhi:

India wants to implement a European Union rule that will require smartphones to have a universal USB-C charging port, and has been in talks with manufacturers about introducing the requirement in India by June 2025, six months after the deadline in the EU. While all manufacturers including Samsung have agreed to India’s plan, Apple is pushing back. [...]

In a closed-door Nov. 28 meeting chaired by India’s IT ministry, Apple asked officials to exempt existing iPhone models from the rules, warning it will otherwise struggle to meet production targets set under India’s production-linked incentive (PLI) scheme, according to the meeting minutes seen by Reuters. [...]

In terms of market share, Apple accounts for 6% of India’s booming smartphone market, compared with just about 2% four years ago. Apple suppliers have expanded their facilities and make most iPhone 12, 13, 14 and 15 models in India for local sales and exports, Counterpoint Research estimates. Only iPhone 15 has the new universal charging port. Apple told Indian officials in the meeting that the “design of the earlier products cannot be changed,” the document showed.

Consumers in India’s price-conscious market prefer buying older models of iPhones which typically become cheaper with new launches, and India’s push for the common charger on older models could hit Apple’s targets, said Prabhu Ram, head of the Industry Intelligence Group at CyberMedia Research. “Apple’s fortunes in India have primarily been tied to older generation iPhones,” he said.

I was under the impression that the EU’s USB-C requirement will only apply to new devices, but maybe not? A plain reading of this EU press release suggests that all phones sold, starting in 2025, must have USB-C charging ports:

By the end of 2024, all mobile phones, tablets and cameras sold in the EU will have to be equipped with a USB Type-C charging port. From spring 2026, the obligation will extend to laptops.

That would mean that, starting in January 2025, the only iPhones available in the EU would be this year’s iPhones 15 and next year’s iPhones 16. A new fourth-generation iPhone SE with USB-C would give Apple a much-needed lower-priced model. The second-gen SE came in 2020; the current third-gen SE in 2022.

See also: Ben Lovejoy at 9to5Mac.

An AppleScript for Safari: Split Tabs to New Window 

I finally got around to scratching a longstanding itch. I’m an inveterate web browser tab hoarder, and a scenario I frequently encounter is wanting to move the most recent (typically, rightmost) tabs into a new window all by themselves. Let’s say, for example, I have 26 tabs open in the frontmost Safari window, A through Z. The current selected tab is X. This script will move tabs X, Y, and Z to a new window, leaving tabs A through W open in the old window. It starts with the current tab, and moves that tab and those to the right.
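For anyone curious what that logic looks like, here’s a minimal sketch of the same idea — to be clear, this is not the script linked in this post (nor Cowle’s refinement), and it takes a shortcut: it reopens the tabs by URL in a new window, so each tab’s back/forward history is lost along the way.

```applescript
-- Sketch: move the current Safari tab, and every tab to its right,
-- into a new window. Assumes Safari is running with a front window.
tell application "Safari"
	set winID to id of front window
	set startIndex to index of current tab of window id winID
	-- Collect the URLs of the current tab and all tabs to its right.
	set theURLs to {}
	repeat with i from startIndex to (count of tabs of window id winID)
		set end of theURLs to URL of tab i of window id winID
	end repeat
	-- Open the first URL in a new window, then the rest as tabs.
	make new document with properties {URL:item 1 of theURLs}
	repeat with i from 2 to (count of theURLs)
		tell front window to make new tab with properties {URL:(item i of theURLs)}
	end repeat
	-- Close the moved tabs in the original window, right to left,
	-- so the remaining indexes don't shift out from under the loop.
	repeat with i from (count of tabs of window id winID) to startIndex by -1
		close tab i of window id winID
	end repeat
end tell
```

Referring to the original window by its `id` (rather than as `front window`) matters here, because `make new document` changes which window is frontmost mid-script.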

I have the script saved in my FastScripts scripts folder for Safari, but I tend to invoke it from LaunchBar (which I have configured to index my entire scripts folder hierarchy). Command-Space to bring up LaunchBar, type “spl” to select this script, hit Return, done.

I have no idea how many others might want this, but in recent years here at DF I’ve gotten away from sharing my occasional scripting hacks, and feel like I ought to get back to sharing them. Can’t let Dr. Drang have all the fun.

Update: Leon Cowle adapted my script to be more elegant and concise. If you’re using this but grabbed the script before 10:30pm ET, go back and re-grab it.

iCloud Advanced Data Protection Uptake Amongst DF Readers 

Back in August I ran a poll on Mastodon, asking my followers if they have iCloud Advanced Data Protection enabled. iCloud Advanced Data Protection was announced a year ago this week, alongside support for hardware security keys (e.g. YubiKey). The results, from 2,304 responses:

  • Yes: 29%
  • No: 59%
  • No, but would if not for device(s) with old OSes: 12%

Count me in that last group. I’ve got a handful of old devices that I still use which can’t be updated to an OS version that supports the feature. But one of these days I’ll just sign out of iCloud on those devices and enable this.

As ever when I run polls like this, it should go without saying that the Daring Fireball audience is not representative of the general public. The results for this poll — with nearly 30 percent of respondents having an esoteric security feature enabled — exemplify that.

‘The Lost Voice’ 

One of Apple’s latest accessibility features is Personal Voice — for people who are “at risk of voice loss or have a condition that can progressively impact your voice”, Personal Voice lets you create a voice that sounds like you.

The Lost Voice is a two-minute short film directed by Taika Waititi celebrating this feature. It’s a splendid, heartwarming film, and it’s especially remarkable to see so much effort, such remarkable production values and filmmaking talent, being applied to marketing a feature for a tiny fraction of Apple’s users. Most people do not need this feature. But for those who do, it seems life-altering. Genuinely profound.

Apple at its very best.

See also: Shelly Brisbin at Six Colors.

First Trailer for ‘Grand Theft Auto VI’ 

Three thoughts:

  • I did not expect to hear a Tom Petty song in a GTA trailer, but I love it. It works. (Hard to escape the feeling though that the Petty estate is willing to sell songs in ways Petty himself wouldn’t have.)

  • The game looks amazing.

  • “Coming 2025”! Holy smokes, this game has been in development for a decade. (GTA 5 came out in late 2013 and has sold 190 million copies and generated over $8 billion.)

Software Applications Incorporated 

You’ve probably seen Infinite Mac, the web-based emulator of classic Mac OS, before. But Software Inc. — a new company from some of the people behind Workflow, which became Shortcuts after acquisition by Apple — used it to create their company website, and it’s delightful.

Kolide 

My thanks to Kolide for sponsoring last week at DF. Getting OS updates installed on end user devices should be easy. After all, it’s one of the simplest yet most impactful ways that every employee can practice good security. On top of that, every MDM solution promises that it will automate the process and install updates with no user interaction needed. Yet in the real world, it doesn’t play out like that. Users don’t install updates and IT admins won’t force installs via forced restart.

With Kolide, when a user’s device — be it Mac, Windows, Linux, or mobile — is out of compliance, Kolide reaches out to them with instructions on how to fix it. The user chooses when to restart, but if they don’t fix the problem by a predetermined deadline, they’re unable to authenticate with Okta.

Watch Kolide’s on-demand demo to learn more about how it enforces device compliance for companies with Okta.

The Talk Show: ‘The Blurry Edge of Acceptable’ 

Nilay Patel returns to the show. Topics include the iPhones 15, journalism in the age of AI, and what it’s like to have Barack Obama on your podcast.

Sponsored by:

  • Trade Coffee: Let’s coffee better. Get a free bag of fresh coffee with any Trade subscription.
  • Squarespace: Save 10% off your first purchase of a website or domain using code talkshow.
  • Nuts.com: The world’s best snacks, delivered fast and fresh.

Maybe It Was a Panoramic Photo 

Faruk Korkmaz posits a seemingly likely explanation for that “computational photography glitch in a bridal shop” photo: it was taken in Panoramic mode. The subject claims it wasn’t a Panoramic mode photo, but she didn’t snap the photo, and if a photo taken in Panoramic mode isn’t wide enough to reach some threshold, the Photos app does not identify/badge it as such. And conversely, a normal photograph cropped to a very wide aspect ratio will be badged as Panoramic — like this and this from my own library — even though it wasn’t snapped in Panoramic mode.

I think it’s quite likely Korkmaz is correct that this is the explanation for how this photo was created; I remain unconvinced that it wasn’t a deliberate publicity stunt.

‘Voice of a Star Wars Fan’ 

This is just an astonishing 20-minute film by Hiroshi Sumi. An homage and loving look back at the earliest days of Industrial Light and Magic. I don’t want to say much more than that lest I spoil the wonder of it. I don’t know why anyone would exert so much effort to make something like this but I’m so inordinately delighted that Sumi did. It speaks to the power of obsession.

After you watch it, take a look at this tweet from Sumi, and this prototype rendering from three years ago.

Just amazing. So much obvious love. (Via Todd Vaziri.)

CNBC Gets an Inside Look at an Apple Chip Lab 

Katie Tarasov, CNBC:

In November, CNBC visited Apple’s campus in Cupertino, California, the first journalists allowed to film inside one of the company’s chip labs. We got a rare chance to talk with the head of Apple silicon, Johny Srouji, about the company’s push into the complex business of custom semiconductor development, which is also being pursued by Amazon.

“We have thousands of engineers,” Srouji said. “But if you look at the portfolio of chips we do: very lean, actually. Very efficient.”

Can’t say there’s any news in this, but it’s neat to see inside the chip-testing lab. (Same video is available on YouTube, too, if that’s your jam.)

Amazon’s Fire TV Is Adding Full-Screen Video Ads That Play When You Start Your Fire TV 

Luke Bouma, writing for Cord Cutters:

Today, Cord Cutters News has confirmed that Amazon is adding full-screen video ads that will play when you start your Fire TV unless you quickly perform an action on it.

This new update will be rolling out to all Fire TVs made in 2016 or newer. With this update, the ad at the top of your Fire TV will now start playing full-screen, often promoting a movie or TV show. By hitting the home button, you can quickly exit the ad or if you quickly perform an action on the Fire TV once it finishes, you will avoid the video ad, but you only have a few seconds.

“Our focus is on delivering an immersive experience so customers can enjoy their favorite TV shows and movies, as well as browse and discover more content they’ll want to watch. We’re always working to make the Fire TV experience better for customers and have updated one of the prominent placements in the UI to play a short content preview if no other action is taken by a customer upon turning on their Fire TV,” Amazon said in a statement to Cord Cutters News.

What a load of horseshit from Amazon in that statement. Autoplaying ads aren’t “immersive”. And this is in no way “working to make the Fire TV experience better for customers”. Working to make things better would mean getting rid of shit like this, not adding it.

I really don’t understand how anyone uses anything but an Apple TV box. Apple TV is far from perfect but holy hell, it really does start from the perspective of respecting you, the user. The people at Apple who make it are obviously trying to create the experience that they themselves want when they’re watching TV at home.

Calling ‘Fake’ on the ‘iPhone Computational Photography Glitch in a Bridal Shop’ Viral Photo 

Wesley Hillard, self-described “Rumor Expert”, writing at AppleInsider:

A U.K. comedian and actor named Tessa Coates was trying on wedding dresses when a shocking photo of her was taken, according to her Instagram post shared by PetaPixel. The photo shows Coates in a dress in front of two mirrors, but each of the three versions of her had a different pose.

One mirror showed her with her arms down, the other mirror showed her hands joined at her waist, and her real self was standing with her left arm at her side. To anyone who doesn’t know better, this could prove to be quite a shocking image.

To the contrary, to anyone who “knows better”, this image clearly seems fake. But it’s a viral sensation:

Coates, in her Instagram description, claims “This is a real photo, not photoshopped, not a pano, not a Live Photo”, but I’m willing to say she’s either lying or wrong about how the photo was taken. Doing so feels slightly uncomfortable, given that the post was meant to celebrate her engagement, but I just don’t buy it. These are three entirely different arm poses, not three moments in time fractions of a second apart — and all three poses in the image are perfectly sharp. iPhone photography just doesn’t work in a way that would produce this image. I’d feel less certain this was a fake if there were motion blur in the arms in the mirrors. You can get very weird-looking photos from an iPhone’s Pano mode, but again, Coates states this is not a Pano mode image. (Perhaps you can generate an image like this using a Google Pixel 8’s Best Take feature, but this is purportedly from an iPhone, which doesn’t have a feature like that. And even with Best Take, that’s a feature you invoke manually, using multiple original images as input. I don’t think any phone camera, let alone an iPhone, produces single still images such as this.)

In a thread on Threads, where several commenters are rightfully skeptical:

  • Tyler Stalman (who hosts a great podcast on photography and videography):

    Any iPhone photographer can confirm that this is not an image processing error, it would never look like this.

  • David Imel (a writer/researcher for MKBHD):

    I really, REALLY do not think this is a real image. HDR on phones takes 5-7 frames with split-second exposure times. Whole process like .05 sec. Even a live photo is < 2 seconds.

    Even if the phone thought they were diff people it wouldn’t stitch like this and wouldn’t have time.

    This is spreading everywhere and it’s driving me insane.

I challenge anyone who thinks this is legit to produce such an image using an iPhone with even a single mirror in the scene, let alone two. If I’m wrong, let me know.

Update 1: Claude Zeins takes me up on my challenge.

Update 2: In a long-winded story post, Coates says she went to an Apple Store for an explanation and was told by Roger, the “grand high wizard” of Geniuses at the store, that Apple is “beta testing” a feature like Google’s Best Take. Which is not something Apple does, and which, even if they did, would require her to have knowingly installed an iOS beta.

Update 3: Best theory to date: it was, despite Coates’s claim to the contrary, taken in Panoramic mode.

Podcast App Castro Might Be Dying 

Jason Snell:

Castro has been a popular iOS podcast app for many years, but right now things look grim.

The cloud database that backs the service is broken and needs to be replaced. As a result, the app has broken. (You can’t even export subscriptions out of it, because even that function apparently relies on the cloud database.) “The team is in the progress of setting up a database replacement, which might take some time. We aim to have this completed ASAP,” said an Xtweet from @CastroPodcasts.

What’s worse, according to former Castro team member Mohit Mamoria, “Castro is being shut down over the next two months.”

I always appreciated Castro — it’s a well-designed, well-made app that embraced iOS design idioms. But as a user it just never quite fit my mental model for how a podcast client should work, in the way that Overcast does. I wanted to like Castro more than I actually liked it.

From my perspective as a publisher, Castro was the 4th or 5th most popular client for The Talk Show for a while, but in recent years it has slipped. Right now it’s 10th — and client popularity follows a steep long-tail curve, so 10th place is a distant 10th. Overcast remains 1st; Apple Podcasts 2nd. The truth is, if not for Overcast, Castro would likely be in that top position, not shutting down. But Overcast does exist, and it’s the app where most people with exquisite taste in UI are listening to podcasts. There aren’t many markets where listeners of The Talk Show are in the core demographic, but iOS podcast apps are one. I can’t say why or precisely when, but somewhere along the line Castro lost its mojo.

I salute everyone who’s worked on it, though, because it really is a splendid app.

MacOS Security Prompts Need a Rethinking 

Jason Snell, writing at Six Colors:

Last month I wrote about how Apple’s cascade of macOS alerts and warnings ruin the Mac upgrade experience. [...]

This issue was brought home to me last week when I was reviewing the M3 iMac and the M3 MacBook Pro. As a part of reviewing those computers, I used Migration Assistant to move a backup of my Mac Studio to the new systems via a USB drive. Sometimes I try to review a computer with nothing migrated over, but it can be a real slowdown and I didn’t really have any time to spare last week.

Anyway, by migrating, I got to (twice) experience Apple’s ideal process of moving every user from one Mac to the next. You start up your new computer, migrate from a backup of the old computer, and then start using the new one. There’s a lot that’s great about this process, and it’s so much better than what we used to have to do to move files over from one Mac to another.

And yet all of Apple’s security alerts got in the way again and spoiled the whole thing. Here’s a screenshot I took right after my new Mac booted for the first time after migration.

I went through the exact same thing. Except if I had taken a screenshot of all the security-permission alerts I had to go through, there would have been more of them — and Snell’s screenshot already looks like a parody. Back in the heyday of the “Get a Mac” TV ad campaign, Apple justifiably lambasted Windows Vista for its security prompts, but that’s exactly the experience you get after running Migration Assistant on a Mac today. It’s terrible.

Don’t get me wrong: Migration Assistant is borderline miraculous. It’s a wonderful tool that seemingly just keeps getting better. But MacOS itself stores too many security/privacy settings in a way that ties them to the device, not your user account. There ought to be some way to OK all these things in one fell swoop.

As Snell says, setting up a new Mac should be a joy, not a chore. Migration Assistant takes care of so much, but these cursed security prompts spoil the experience.

The Perils of Charging for Emergency Services 

Kyle Melnick, reporting last week for The Washington Post under the headline “A Toddler Was Taken in a Carjacking; VW Wanted $150 for GPS Coordinates, Lawsuit Says”:

Shepherd, who was four months pregnant, tried to fight off the man. But she was thrown to the pavement and run over by her own car as the man drove away with Isaiah in the back seat, authorities said. Shepherd thought she might never see her son again.

After Shepherd frantically called 911, investigators contacted Volkswagen’s Car-Net service, which can track the location of the manufacturer’s vehicles. They hoped to locate Isaiah.

But a customer service representative said that wouldn’t be possible because Shepherd’s subscription to the satellite service had expired, according to a new lawsuit. The employee said he couldn’t help until a $150 payment was made, the complaint said.

This perfectly illustrates the perils of Apple eventually charging for Emergency SOS satellite service. If Apple someday cuts off free service for compatible iPhones, eventually there’s going to be someone who dies because they chose not to pay to continue service. No one wants that.

Apple Extends Emergency SOS via Satellite for an Additional Free Year 

Apple Newsroom, two weeks ago:

One year ago today, Apple’s groundbreaking safety service Emergency SOS via satellite became available on all iPhone 14 models in the U.S. and Canada. Now also available on the iPhone 15 lineup in 16 countries and regions, this innovative technology — which enables users to text with emergency services while outside of cellular and Wi-Fi coverage — has already made a significant impact, contributing to many lives being saved. Apple today announced it is extending free access to Emergency SOS via satellite for an additional year for existing iPhone 14 users.

My hunch on this is that Apple would like to make this available free of charge in perpetuity, but wasn’t sure how much it would actually get used, and thus how much it would actually cost. If they come right out and say it’s free forever, then it needs to be free forever. It’s safer to just do what they’ve done here: extend the free service one year at a time, and see how it goes as more and more iPhones that support the feature remain in active use.

It’s a wonderful feature — quite literally life-saving in numerous cases — but it’d be hard to sell. It’s like buying insurance. People like paying for stuff they want to use, not for stuff they hope they never need. Obviously, people do buy insurance — Apple itself, of course, sells AppleCare — but how many people would pay extra for Emergency SOS? If Apple can just quietly eat the cost of this service, they should, and I think will.

Charlie Munger, Warren Buffett’s Partner and Vice Chairman of Berkshire Hathaway, Dies at 99 

Andrew Ross Sorkin and Robert D. Hershey Jr., reporting for The New York Times:

Charles T. Munger, who quit a well-established law career to be Warren E. Buffett’s partner and maxim-spouting alter-ego as they transformed a foundering New England textile company into the spectacularly successful investment firm Berkshire Hathaway, died on Tuesday in Santa Barbara, Calif. He was 99.

His death, at a hospital, was announced by Berkshire Hathaway. He had a home in Los Angeles.

Although overshadowed by Mr. Buffett, who relished the spotlight, Mr. Munger, a billionaire in his own right — Forbes listed his fortune as $2.6 billion this year — had far more influence at Berkshire than his title of vice chairman suggested.

Mr. Buffett has described him as the originator of Berkshire Hathaway’s investing approach. “The blueprint he gave me was simple: Forget what you know about buying fair businesses at wonderful prices; instead, buy wonderful businesses at fair prices,” Mr. Buffett once wrote in an annual report. [...]

A $1,000 investment in Berkshire made in 1964 is worth more than $10 million today.

Mr. Munger was often viewed as the moral compass of Berkshire Hathaway, advising Mr. Buffett on personnel issues as well as investments. His hiring policy: “Trust first, ability second.”

A new edition of Munger’s book of aphorisms, Poor Charlie’s Almanack — its title an allusion to Munger’s idol, Benjamin Franklin — is due next week.

WSJ Reports That Apple and Goldman Sachs Are Parting Ways on Apple Card 

AnnaMaria Andriotis, reporting for The Wall Street Journal (News+):

Apple is pulling the plug on its credit-card partnership with Goldman Sachs, the final nail in the coffin of the Wall Street bank’s bid to expand into consumer lending.

The tech giant recently sent a proposal to Goldman to exit from the contract in the next roughly 12-to-15 months, according to people briefed on the matter. The exit would cover their entire consumer partnership, including the credit card the companies launched in 2019 and the savings account rolled out this year.

It couldn’t be learned whether Apple has already lined up a new issuer for the card.

Apple Card is a strange product — everyone I know who has one likes it (including me), but Goldman itself has reported that they’ve lost $3 billion since 2020 on it. The savings accounts are a hit with customers too.

American Express is rumored to be one possible partner, but it would be pretty strange for Apple Cards to transmogrify from MasterCard to Amex cards overnight. There are still a lot of businesses — particularly throughout Europe — that accept MasterCard but not Amex. It’s not just that Apple Card would no longer be accepted at businesses where it previously was; such a change would also highlight the fact that Apple Card is really just an Apple-branded card issued by a company that isn’t Apple. Apple wants you to think of Apple Card as, well, an Apple credit card.

Ian Hickson: ‘Reflecting on 18 Years at Google’ 

Ian Hickson, who recently left Google after an 18-year stint:

The lack of trust in management is reflected by management no longer showing trust in the employees either, in the form of inane corporate policies. In 2004, Google’s founders famously told Wall Street “Google is not a conventional company. We do not intend to become one.” but that Google is no more.

Much of these problems with Google today stem from a lack of visionary leadership from Sundar Pichai, and his clear lack of interest in maintaining the cultural norms of early Google. A symptom of this is the spreading contingent of inept middle management. [...]

It’s definitely not too late to heal Google. It would require some shake-up at the top of the company, moving the centre of power from the CFO’s office back to someone with a clear long-term vision for how to use Google’s extensive resources to deliver value to users. I still believe there’s lots of mileage to be had from Google’s mission statement (“to organize the world’s information and make it universally accessible and useful”). Someone who wanted to lead Google into the next twenty years, maximising the good to humanity and disregarding the short-term fluctuations in stock price, could channel the skills and passion of Google into truly great achievements.

I do think the clock is ticking, though. The deterioration of Google’s culture will eventually become irreversible, because the kinds of people whom you need to act as moral compass are the same kinds of people who don’t join an organisation without a moral compass.

This jibes with my perception of Google from the outside. Early Google did two things great:

  • They introduced a steady stream of groundbreaking new products and services that served their mission statement, with broad appeal to the public.
  • They ruthlessly fought against bloat and feature creep in the products they had already shipped.

Neither of those things has been true in recent years, and the responsibility clearly falls on Pichai.

Sports Illustrated Published Painfully Bad Articles by Fake AI-Generated Writers 

Maggie Harrison, writing for Futurism:

The only problem? Outside of Sports Illustrated, Drew Ortiz doesn’t seem to exist. He has no social media presence and no publishing history. And even more strangely, his profile photo on Sports Illustrated is for sale on a website that sells AI-generated headshots, where he’s described as “neutral white young-adult male with short brown hair and blue eyes.”

Ortiz isn’t the only AI-generated author published by Sports Illustrated, according to a person involved with the creation of the content who asked to be kept anonymous to protect them from professional repercussions. “There’s a lot,” they told us of the fake authors. “I was like, what are they? This is ridiculous. This person does not exist.”

“At the bottom [of the page] there would be a photo of a person and some fake description of them like, ‘oh, John lives in Houston, Texas. He loves yard games and hanging out with his dog, Sam.’ Stuff like that,” they continued. “It’s just crazy.”

The AI authors’ writing often sounds like it was written by an alien; one Ortiz article, for instance, warns that volleyball “can be a little tricky to get into, especially without an actual ball to practice with.”

What an incredible fall from grace for what was, for decades, a truly great magazine. I can see how they thought they’d get away with it, though — Sports Illustrated’s human-written articles are now mostly clickbait junk anyway.

Details From Unsealed Lawsuit on Instagram Looking the Other Way at Preteen Accounts and Knowingly Serving Teens Harmful Content 

Tangentially related to the last item, here’s Eva Rothenberg reporting for CNN:

Since at least 2019, Meta has knowingly refused to shut down the majority of accounts belonging to children under the age of 13 while collecting their personal information without their parents’ consent, a newly unsealed court document from an ongoing federal lawsuit against the social media giant alleges. [...]

According to the 54-count lawsuit, Meta violated a range of state-based consumer protection statutes as well as the Children’s Online Privacy Protection Rule (COPPA), which prohibits companies from collecting the personal information of children under 13 without a parent’s consent. Meta allegedly did not comply with COPPA with respect to both Facebook and Instagram, even though “Meta’s own records reveal that Instagram’s audience composition includes millions of children under the age of 13,” and that “hundreds of thousands of teen users spend more than five hours a day on Instagram,” the court document states.

One Meta product designer wrote in an internal email that the “young ones are the best ones,” adding that “you want to bring people to your service young and early,” according to the lawsuit.

Not a good look.

The unsealed complaint also alleges that Meta knew that its algorithm could steer children toward harmful content, thereby harming their well-being. According to internal company communications cited in the document, employees wrote that they were concerned about “content on IG triggering negative emotions among tweens and impacting their mental well-being (and) our ranking algorithms taking [them] into negative spirals & feedback loops that are hard to exit from.”

On that last point, Jason Kint posted a long thread on Twitter/X highlighting previously redacted details from the lawsuit.

Wall Street Journal Investigation Shows Instagram Serving Skeevy Videos, Alongside Mainstream Ads, to Adults Who Follow Accounts Featuring Young Gymnasts and Cheerleaders 

Jeff Horwitz and Katherine Blunt, reporting for The Wall Street Journal:

The Journal sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform. Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos — and ads for some of the biggest U.S. brands.

The Journal set up the test accounts after observing that the thousands of followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults. The Journal also tested what the algorithm would recommend after its accounts followed some of those users as well, which produced more-disturbing content interspersed with ads.

In a stream of videos recommended by Instagram, an ad for the dating app Bumble appeared between a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting up her shirt to expose her midriff. In another, a Pizza Hut commercial followed a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl.

Worse, Meta has known of the Journal’s findings since August and the problem continues:

The Journal informed Meta in August about the results of its testing. In the months since then, tests by both the Journal and the Canadian Centre for Child Protection show that the platform continued to serve up a series of videos featuring young children, adult content and apparent promotions for child sex material hosted elsewhere.

As of mid-November, the center said Instagram is continuing to steadily recommend what the nonprofit described as “adults and children doing sexual posing.”

There’s no plausible scenario where Instagram wants to cater to pedophiles, but it’s seemingly beyond their current moderation capabilities to determine the content of videos at scale. Solving this ought to be their highest priority.

Kolide 

My thanks to Kolide for sponsoring last week at DF. Getting OS updates installed on end user devices should be easy. After all, it’s one of the simplest yet most impactful ways that every employee can practice good security. On top of that, every MDM solution promises that it will automate the process and install updates with no user interaction needed. Yet in the real world, it doesn’t play out like that. Users don’t install updates and IT admins won’t force installs via forced restart.

With Kolide, when a user’s device — be it Mac, Windows, Linux, or mobile — is out of compliance, Kolide reaches out to them with instructions on how to fix it. The user chooses when to restart, but if they don’t fix the problem by a predetermined deadline, they’re unable to authenticate with Okta.

Watch Kolide’s on-demand demo to learn more about how it enforces device compliance for companies with Okta.

Audio Hijack 4.3’s New Transcribe Block 

Rogue Amoeba:

Transcribe can convert speech from an astonishing 57 languages into text, providing you with a written transcript of any spoken audio. It’s powered by OpenAI’s automatic speech recognition system Whisper, and features two powerful models for fast and accurate transcriptions.

Best of all, unlike traditional transcription services, Transcribe works for free inside of Audio Hijack. There’s absolutely no ongoing cost, so you can generate unlimited transcriptions and never again pay a per-minute charge. It’s pretty incredible.

It’s also completely private. When you use Transcribe, everything happens right on your Mac. That means your data is never sent to the cloud, nor shared with anyone else.

This makes for a perfect one-two shot with Retrobatch 2: Audio Hijack is also a node-based media tool (which predates Retrobatch), and this new Transcribe block is also putting powerful machine learning tools into an easily accessible form.

This Transcribe feature in Audio Hijack is also an exemplar of the power of Apple silicon — it works on Intel-based Macs too, but it’s just incredibly fast on Apple silicon (I suspect because of the Neural Engine on every M-series chip).

Retrobatch 2.0 

Gus Mueller, writing at the Flying Meat blog:

In case you’re not aware, Retrobatch is a node-based batch image processor, which means you can mix, match, and combine different operations together to make the perfect workflow. It’s kind of neat. And version 2 is even neater. [...]

Retrobatch is obviously not Flying Meat’s most important app (Acorn would fill that role), but I really do like working on it and there’s a bunch more ideas that I want to implement. I feel like Retrobatch is an app that the Mac needs, and it makes me incredibly happy to read all the nice letters I get from folks when they figure out how to use it in their daily work.

Five years after Retrobatch 1 shipped, I’m happy to see version 2 out in the world. And I can’t wait to see what folks are going to do with it.

“Node-based batch image processor” means that you design and tweak your own image processing workflows not with code, but through a visual drag-and-drop interface. (But you can use code, via nodes for JavaScript, AppleScript, and shell scripts.) You can program your own highly customized image processing workflows without knowing anything about writing code. It’s useful for creating workflows that work on just one image at a time, but Retrobatch really shines for batch processing.

There are a zillion new features in version 2, but the star of the show has to be the new “ML Super Resolution” 4× upscaler: a powerful machine learning model made easily accessible.

Teenage Engineering’s EP–133 K.O. II Sampler 

I can’t read or play music, and struggle even to clap to a beat, so I would have zero use for this device. But I still want to buy one. Just look at it. Absolutely gorgeous.

The Talk Show: ‘Two-Legged Stool’ 

Special guest Gabe Rivera, founder of the indispensable news aggregator Techmeme, joins the show to talk about the state of news and social media. Thanksgiving fun for the entire family — turn the volume down on the Packers-Lions game tomorrow and listen to this instead. (Turn the volume back up, of course, for the Commanders-Cowboys game.)

Sponsored by:

  • Memberful: Monetize your passion with membership. Start your free trial today.
  • AdBlock Pro: Block ads in Safari on iPhone, iPad, and Mac.
  • Squarespace: Use code talkshow to save 10% off your first purchase of a website or domain.
Former Designer of Google Maps on Its Updated Design 

Elizabeth Laraki, in an article-length post on Twitter/X:

15 years ago, I helped design Google Maps. I still use it every day. Last week, the team dramatically changed the map’s visual design. I don’t love it. It feels colder, less accurate and less human. But more importantly, they missed a key opportunity to simplify and scale. [...]

So much stuff has accumulated on top of the map. Currently there are ~11 different elements obscuring it:

  • Search box
  • 8 pills overlaid in 4 rows
  • A peeking card for “latest in the area”
  • A bottom nav bar

This is a very long way of saying that Google Maps’s app design should be more like Apple Maps’s. In fact, Apple Maps has fewer UI elements obscuring actual map content than she’s proposing for Google Maps.

The OpenAI Coup Saga Seemingly Ends, With Sam Altman Returning as CEO 

Nilay Patel and Alex Heath, reporting for The Verge:

Sam Altman will return as CEO of OpenAI, overcoming an attempted boardroom coup that sent the company into chaos over the past several days. Former president Greg Brockman, who quit in protest of Altman’s firing, will return as well.

The company said in a statement late Tuesday that it has an “agreement in principle” for Altman to return alongside a new board composed of Bret Taylor, Larry Summers, and Adam D’Angelo. D’Angelo is a holdover from the previous board that initially fired Altman on Friday. He remains on this new board to give the previous board some representation, we’re told.

People familiar with the negotiations say that the main job of this small initial board is to vet and appoint an expanded board of up to nine people that will reset the governance of OpenAI. Microsoft, which has committed to investing billions in the company, wants to have a seat on that expanded board, as does Altman himself.

The question I’ve focused on from the start of this soap opera is who really controls OpenAI? The board thought it was them. It wasn’t. Matt Levine had the funniest-because-it’s-true take in his Money Stuff column — I don’t want to spoil it, just go there and look at his “slightly annotated” version of OpenAI’s diagram of their corporate structure.

See also: The Wall Street Journal’s compelling story of the drama behind the scenes (News+ link).


More on Sam Altman’s Ouster From OpenAI

Kevin Roose, reporting for The New York Times:

An all-hands meeting for OpenAI employees on Friday afternoon didn’t reveal much more. Ilya Sutskever, the company’s chief scientist and a member of its board, defended the ouster, according to a person briefed on his remarks. He dismissed employees’ suggestions that pushing Mr. Altman out amounted to a “hostile takeover” and claimed it was necessary to protect OpenAI’s mission of making artificial intelligence beneficial to humanity, the person said.

Mr. Altman appears to have been blindsided, too. He recorded an interview for the podcast I co-host, “Hard Fork,” on Wednesday, two days before his firing. During our chat, he betrayed no hint that anything was amiss, and he talked at length about the success of ChatGPT, his plans for OpenAI and his views on A.I.’s future.

Mr. Altman stayed mum about the precise circumstances of his departure on Friday. But Greg Brockman — OpenAI’s co-founder and president, who quit on Friday in solidarity with Mr. Altman — released a statement saying that both of them were “shocked and saddened by what the board did today.” Mr. Altman was asked to join a video meeting with the board at noon on Friday and was immediately fired, Mr. Brockman said.

Kara Swisher was all over the story last night, writing on Twitter/X:

Sources tell me that the profit direction of the company under Altman and the speed of development, which could be seen as too risky, and the nonprofit side dedicated to more safety and caution were at odds. One person on the Sam side called it a “coup,” while another said it was the right move. [...]

More: The board members who voted against Altman felt he was manipulative and headstrong and wanted to do what he wanted to do. That sounds like a typical SV CEO to me, but this might not be a typical SV company. They certainly have a lot of explaining to do.

According to Brockman — who until he quit in protest of Altman’s firing was chairman of the OpenAI board — he didn’t find out until just 5 minutes before Altman was sacked. I’ve never once heard of a corporate board firing the company’s CEO behind the back of the chairman of the board.

It really does look more and more like a deep philosophical fissure inside OpenAI, between those led by Sutskever (and, obviously, a majority of the board) advocating a cautious, slow, and genuinely non-profit-driven approach, and Altman/Brockman’s “let’s move fast, change the world, and make a lot of money” side. Sutskever and the OpenAI board seemingly see Altman/Brockman as reckless swashbucklers; Altman and Brockman, I suspect, see Sutskever and his side as a bunch of ninnies.

A simple way to look at it is to read OpenAI’s charter, “the principles we use to execute on OpenAI’s mission”. It’s a mere 423 words, and very plainly written. It doesn’t sound anything at all like the company Altman has been running. The board, it appears, believes in the charter. How in the world it took them until now to realize Altman was leading OpenAI in directions completely contrary to their charter is beyond me.

It’s like the police chief in Casablanca being “shocked — shocked!” to find out that gambling was taking place in a casino where he played.


Qualcomm’s Awkward Boasting Regarding Its Forthcoming X Elite Platform

Monica Chin, reporting for The Verge last month, “Qualcomm Claims Its Snapdragon X Elite Processor Will Beat Apple, Intel, and AMD”:

Qualcomm has announced its new Snapdragon X Elite platform, which looks to be its most powerful computing processor to date. The chips (including the new Qualcomm Oryon, announced today) are built on a 4nm process and include 136GB/s of memory bandwidth. PCs are expected to ship in mid-2024. [...]

Oh, Qualcomm also claims that its chip will deliver “50% faster peak multi-thread performance” than Apple’s M2 chip. This is just a funny claim; the X Elite has 50 percent more cores than the M2 and sucks down much more power, so of course it is going to do better on Geekbench at “peak multi-thread performance.” That’s like a professional sprinter bragging about winning the 100-meter dash against a bunch of marathon champions.

This news is so old that Chin is no longer on the staff at The Verge (which I think explains why she didn’t write either of their reviews for the new M3 MacBook Pros), but I’m cleaning up old tabs and wanted to comment on this.

It’s nonsense. Chips that aren’t slated to appear in any actual laptops until “mid-2024” are being compared to the M2, which Apple debuted with the MacBook Air in June 2022. So even if Qualcomm’s performance claims are true and PCs based on their chips ship on schedule, they’re comparing against a chip that Apple debuted two entire years earlier.

Plus they’re only comparing multi-core performance against the base M2. And they’re not really comparing multi-core performance overall but “peak” performance, however it is they define that. And the fact that they only mention multi-core performance strongly suggests that they’re slower than the M2 at single-core performance, which for most consumer/prosumer use cases is more important.

And: No one in the PC world seems to care about ARM chips, at least for laptops. Microsoft made a go of it with their Surface line and largely gave up. My understanding is that fewer than 1 percent of PC sales today are ARM-based machines. If Microsoft wasn’t willing to optimize Windows to make it ARM-first, or even treat ARM as an equal to x86, when they themselves were trying to make ARM-based Windows laptops a thing, why would they do it now?

If Mac hardware and MacOS were made by separate companies, and the MacOS software company licensed their OS to other OEMs, I really don’t think Apple silicon hardware would have happened. The seemingly too-good-to-be-true performance of Apple silicon Macs is the result of the silicon being designed for the software and the software being optimized and at very low levels designed for the silicon. Qualcomm isn’t going to get that from Microsoft with Windows.

Qualcomm’s X Elite platform may well beat Intel and AMD, but I’m not sure that will matter in the PC world unless Microsoft truly goes all-in on ARM with Windows. Which I don’t see happening. But the idea that they’re even vaguely catching up to Apple silicon is laughable, and it’s frustrating that so much of the tech press took anything Qualcomm claimed about relative performance against Apple silicon seriously.

We know for a fact that their Snapdragon chips for phones have always lagged years behind Apple’s A-series chips in both sheer performance and performance-per-watt, with no sign that they’re catching up. So how in the world would their ARM chips for PCs beat Apple’s M-series chips?

And, yes, I predicted this back in November 2021, when Qualcomm claimed they’d be shipping “M-series competitive” chips for PCs by 2023. Qualcomm claimed to still be on track to ship in 2023 just one year ago, so I wouldn’t hold my breath for “mid-2024” either. 


Vision Pro, Spatial Video, and Panoramic Photos

Yesterday Apple released developer beta 2 of iOS 17.2, the first version of iOS to include support for capturing spatial video with iPhone 15 Pro models. Today came the public beta, enabling the same feature. Apple invited me to New York yesterday, not merely to preview capturing spatial video using an iPhone, but to experience watching those spatial videos using a Vision Pro.

The experience was, like my first Vision Pro demo back at WWDC in June, astonishing.

Capturing Spatial Video on an iPhone

Shooting spatial video on an iPhone 15 Pro is easy. The feature is — for now at least — disabled by default, and can be enabled in Settings → Camera → Formats. The option is labeled “Spatial Video for Apple Vision Pro”, which is apt, because (again, for now at least) spatial video only looks different from non-spatial video when playing it back on Vision Pro. When viewed on any other device it just looks like a regular flat video.

Once enabled, you can toggle spatial video capture in the Camera app whenever you’re in the regular Video mode. It’s very much akin to the toggle for Live Photos when taking still photos, or the Action mode toggle for video — not a separate mode in the main horizontal mode switcher in Camera, but a toggle button in the main Video mode.

When capturing spatial video, you have no choice regarding resolution, frame rate, or file format. All spatial video is captured at 1080p, 30 fps, in the HEVC file format.1 You also need to hold the phone horizontally, to put the two capturing lenses on the same plane. The iPhone uses the main (1×) and ultra wide (0.5×) lenses for capture when shooting spatial video, and in fact, Apple changed the arrangement of the three lenses on the iPhone 15 Pro in part to support this feature. (On previous iPhone Pro models, when held horizontally, the ultra wide camera was on the bottom, and the main and telephoto lenses were next to each other on the top.)

I believe resolution is limited to 1080p because to get an image from the ultra wide 0.5× camera with a field of view equivalent to the main 1× camera’s, it needs to crop the ultra wide image significantly. The ultra wide camera has a mere 12 MP sensor, so there just aren’t enough pixels to crop a 1×-equivalent field of view from the center of the sensor and still get a 4K image.
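That back-of-the-envelope math is easy to check. A minimal sketch, assuming a 4:3, roughly 12 MP sensor of about 4000 × 3000 pixels, and assuming the 0.5× lens covers about twice the linear field of view of the 1× lens (both are my assumptions, not Apple-published specs):

```python
# Can a ~12 MP ultra wide sensor yield a 4K crop matching the
# 1x camera's field of view? (All sensor numbers are assumptions.)

ultra_wide_px = (4000, 3000)  # ~12 MP, 4:3 aspect ratio

# If the 0.5x lens sees twice the linear field of view of the 1x lens,
# a 1x-equivalent crop keeps roughly half of each dimension.
crop = (ultra_wide_px[0] // 2, ultra_wide_px[1] // 2)

print(crop)             # (2000, 1500)
print(crop[0] >= 3840)  # False: short of 4K's 3840-pixel width
print(crop[0] >= 1920)  # True: comfortably enough for 1080p
```

Under those assumptions, the center crop lands at roughly 2000 × 1500 pixels: well above 1080p, well below 4K.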

There are two downsides to shooting spatial video with your iPhone. First, the aforementioned 1080p resolution and 30 fps frame rate. I’ve been shooting 4K video by default for years, because why not? I wish we could capture spatial video at 4K, but alas, not yet. The second downside to shooting spatial video is that it effectively doubles the file size compared to non-spatial 1080p, for the obvious reason that each spatial video contains two 1080p video streams. That file-size doubling is a small price to pay — the videos are still smaller than non-spatial 4K 30 fps video.2
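The file-size comparison follows from simple pixel counts. A rough sketch, using pixels per frame as a stand-in for encoded size (an assumption; real HEVC bitrates vary with content and quality settings):

```python
# Pixels per frame, as a crude proxy for encoded size at
# comparable quality settings (an assumption, not a measurement).
p1080 = 1920 * 1080  # one flat 1080p stream
p4k = 3840 * 2160    # one flat 4K stream

spatial = 2 * p1080  # a spatial video carries two 1080p streams

print(spatial / p1080)  # 2.0: double a flat 1080p video
print(spatial / p4k)    # 0.5: still half the pixels of flat 4K
```

So by raw pixel count, a spatial video doubles a flat 1080p recording but still carries only half the pixels of a flat 4K one, which tracks with the file sizes described above.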

Really, it’s just no big deal to capture spatial video on your iPhone. If the toggle button is off, you capture regular video with all the regular options for resolution (720p/1080p/4K) and frame rates (24/30/60). If the toggle for spatial video is on — and when on it’s yellow, impossible to miss — you lose those choices but it just looks like capturing a regular video. And when you play it back, or share it with others who are viewing it on regular devices like their phones or computers, it just looks like a regular flat video.

If you own an iPhone 15 Pro, there’s no good reason not to start capturing spatial videos this year — like, say, this holiday season — to record any sort of moments that feel like something you might want to experience as “memories” with a Vision headset in the future, even if you don’t plan to buy the first-generation Vision Pro next year.

Viewing Spatial Video on Vision Pro

Before my demo, I provided Apple with my eyeglasses prescription, and the Vision Pro headset I used had appropriate corrective lenses in place. As with my demo back in June, everything I saw through the headset looked incredibly sharp.

Apple has improved and streamlined the onboarding/calibration process significantly since June. There are a few steps where you’re presented with a series of dots arranged in a big circle floating in front of you, like the hour indexes on a clock. As you look at each dot, it lights up a bit, and you do the finger tap gesture. It’s the Vision Pro’s way of calibrating that what it thinks you’re looking at is what you actually are looking at. Once that calibration step was over — and it took just a minute or two — I was in, ready to go on the home screen of VisionOS. (And the precision of this calibration is amazing — UI elements can be placed relatively close to each other and it knows exactly which one you’re looking at when you tap. iOS UI design needs to be much more forgiving of our relatively fat fingertips than VisionOS UI design needs to be about the precision of our gaze.)

My demo yesterday was expressly limited to photography in general, and spatial video in particular, and so, per Apple’s request, I stayed within the Photos app in VisionOS. It was tempting, at times, to see where else I could go and what else I could do. But there was so much to see and do in Photos alone that my demo — about 30 minutes in total wearing Vision Pro — raced by.

Prior to separating us into smaller rooms for our time using Vision Pro, I was paired with Joanna Stern from The Wall Street Journal for a briefing on the details of spatial video capture using the iPhone 15 Pro. We were each provided with a demo iPhone, and were allowed to capture our own spatial videos. We were in a spacious modern high-ceiling’d kitchen, bathed in natural light from large windows. A chef was busy preparing various forms of sushi, and served as a model for us to shoot. Joanna and I also, of course, shot footage of each other, while we shot each other. It was very meta. The footage we captured ourselves was then preloaded onto the respective Vision Pro headsets we used for our demos.

We were not permitted to capture spatial video on Vision Pro.3 However, our demo units had one video in the Photos library that was captured on Vision Pro — a video I had experienced before, back in June, of a group of twenty-somethings sitting around a fire pit at night, having fun in a chill atmosphere. There were also several other shot-by-Apple spatial videos which were captured using an iPhone 15 Pro.

One obvious question: How different do spatial videos captured using iPhone 15 Pro look from those captured using a Vision Pro itself? Given that Apple provided only one example spatial video captured on Vision Pro, I don’t feel like I can fully answer that based on my experience yesterday. It did not seem like the differences were dramatic or significant. The spatial videos shot using iPhone 15 Pro that I experienced, including those I captured myself, seemed every bit as remarkable as the one captured using Vision Pro.

Apple won’t come right out and say it, but I do get the feeling that, all things considered, spatial video captured using Vision Pro will be “better”. The iPhone might win out on image quality, given that the 1× main camera on the iPhone 15 Pro is the single best camera system Apple makes, but the Vision Pro should win out on spatiality — 3D-ness — because Vision Pro’s two lenses for spatial video capture are roughly as far apart from each other as human eyes. The two lenses used for capture on an iPhone are, of course, much closer to each other than any pair of human eyes. But despite how close the two lenses are to each other, the 3D effect is very compelling in spatial video captured on an iPhone. It’s somehow simultaneously very natural-looking and utterly uncanny.

It’s a stunning effect, and watching them is a remarkable experience. And so the iPhone, overall, is going to win out as the “best” capture device for spatial video — even if footage captured on Vision Pro is technically superior — because, as the old adage goes, the best camera is the one you have with you. I have my iPhone with me almost everywhere. That will never be even close to true for Vision Pro next year.

Here’s what I wrote about spatial video back in June, after my first hands-on time with Vision Pro:

Spatial photos and videos — photos and videos shot with the Vision Pro itself — are viewed as a sort of hybrid between 2D content and fully immersive 3D content. They don’t appear in a crisply defined rectangle. Rather, they appear with a hazy dream-like border around them. Like some sort of teleportation magic spell in a Harry Potter movie or something. The effect reminded me very much of Steven Spielberg’s Minority Report, in the way that Tom Cruise’s character could obsessively watch “memories” of his son, and the way the psychic “precogs” perceive their visions of murders about to occur. It’s like watching a dream, but through a portal opened into another world.

When you watch regular (non-spatial) videos using Vision Pro, or view regular still photography, the image appears in a crisply defined window in front of you. Spatial videos don’t appear like that at all. I can’t describe it any better today than I did in June: it’s like watching — and listening to — a dream, through a hazy-bordered portal opened into another world.

Several factors contribute to that dream-like feel. Spatial videos don’t look real — it doesn’t look or feel like the subjects are truly there in front of you. The live pass-through video you see in Vision Pro, of the actual real world around you, does. That pass-through video of actual reality is so compelling, so realistic, that in both my demo experiences to date I forgot that I was always looking at video on screens in front of my eyes, not just looking through a pair of goggles with my eyes’ own view of the world around me.

So Vision Pro is capable of presenting video that looks utterly real — because that’s exactly how pass-through video works and feels. Recorded spatial videos are different. For one thing, reality is not 30 fps, nor is it merely 1080p. This doesn’t make spatial videos look low-resolution or crude, per se, but rather more like movies. The upscaled 1080p imagery comes across with a film-like grain, and the obviously-lower-than-reality frame rate conveys a movie-like feel as well. Higher resolution would look better, sure, but I’m not sure a higher frame rate would. Part of the magic of movies and TV is that 24 and 30 fps footage has a dream-like aspect to it.

Nothing you’ve ever viewed on a screen, however, can prepare you for the experience of watching these spatial videos, especially the ones you will have shot yourself, of your own family and friends. They truly are more like memories than videos. The spatial videos I experienced yesterday that were shot by Apple looked better — framed by professional photographers, and featuring professional actors. But the ones I shot myself were more compelling, and took my breath away. There’s my friend, Joanna, right in front of me — like I could reach out and touch her — but that was 30 minutes ago, in a different room.

Prepare to be moved, emotionally, when you experience this.

Panoramic and Still Photos

My briefing and demo experience yesterday was primarily about capturing spatial video on iPhone 15 Pro and watching it on Vision Pro, but my demo went through the entire Photos app experience in VisionOS.

Plain old still photos look amazing. You can resize the virtual window in which you’re viewing photos to as large as you can practically desire. It’s not merely like having a 20-foot display — a size far more akin to that of a movie theater screen than a television. It’s like having a 20-foot display with retina quality resolution, and the best brightness and clarity of any display you’ve ever used. I spend so much time looking at my own iPhone-captured still photos on my iPhone display that it’s hard to believe how good they can look blown up to billboard-like dimensions. Just plain still photos, captured using an iPhone.

And then there are panoramic photos. Apple first introduced Pano mode back in 2012, with the iPhone 5. That feature has never struck me as better than “kind of a cool trick”. In the decade since it became available, I’ve only taken about 200 of them. They just look too unnatural, too optically distorted, when viewed on a flat display. And the more panoramic you make them, the more unnatural they look when viewed flat.

Panoramic photos viewed using Vision Pro are breathtaking.

There is no optical distortion at all, no fish-eye look. It just looks like you’re standing at the place where the panoramic photo was taken — and the wider the panoramic view at capture, the more compelling the playback experience is. It’s incredible, and now I wish I’d spent the last 10 years taking way more of them.

As a basic rule, going forward, I plan to capture spatial videos of people, especially my family and dearest friends, and panoramic photos of places I visit. It’s like teleportation.

Miscellaneous Observations

The Vision Pro experience is highly dependent upon foveated rendering, which Wikipedia succinctly describes as “a rendering technique which uses an eye tracker integrated with a virtual reality headset to reduce the rendering workload by greatly reducing the image quality in the peripheral vision (outside of the zone gazed by the fovea).” Our retinas work like this too — we really only see crisply what falls on the maculas at the center of our retinas. Vision Pro really only renders at high resolution what we are directly staring at. The rest is lower-resolution, but that’s not a problem, because when you shift your gaze, Vision Pro is extraordinarily fast at updating the display.

I noticed yesterday that if I darted my eyes from one side to the other fast enough, I could sort of catch it updating the foveation. Just for the briefest of moments, you can catch something at less than perfect resolution. I think. It is so fast at tracking your gaze and updating the displays that I can’t be sure. It’s just incredible, though, how detailed and high-resolution the overall effect is. My demo yesterday was limited to the Photos app, but I came away more confident than ever that Vision Pro is going to be a great device for reading and writing — and thus, well, work.

The sound quality of the speakers in the headset strap is impressive. The visual experience of Vision Pro is so striking — I mean, the product has “Vision” in its name for a reason — that the audio experience is easy to overlook, but it’s remarkably good.

Navigating VisionOS with your gaze and finger taps is so natural. I’ve spent a grand total of about an hour, spread across two 30-minute demos, using Vision Pro, but I already feel at home using the OS. It’s an incredibly natural interaction model based simply on what you are looking at. My enthusiasm for this platform, and the future of spatial computing, could not be higher. 


  1. The HEVC spec allows for a single file to contain multiple video streams. That’s what Apple is doing, with metadata describing which stream is “left” and which is “right”. Apple released preliminary documentation for this format back in June, just after WWDC. ↩︎

  2. According to Apple, these are the average file sizes per minute of video:

    • Regular 1080p 30 fps: 65 MB
    • Spatial 1080p 30 fps: 130 MB
    • Regular 4K 30 fps: 190 MB ↩︎︎

  3. Paraphrased:

    “This is the digital crown. You’ll be using this today to adjust the immersiveness by turning it, and you’ll press the crown if you need to go back to the home screen. On the other side is the button you would use to capture photos or videos using Vision Pro. We won’t be using that button today.”

    “But does that button work? If I did press that button, would it capture a photo or video?”

    “Please don’t press that button.” ↩︎︎