By John Gruber
Obsidian: the private and flexible writing app that adapts to the way you think. Sign up by Jan 1st for a special offer.
Carl Willis, reporting for ABC 7News in Washington D.C.:
“As soon as he parked the car two masked gentlemen came up to him, armed,” she said. “They robbed him, took everything he had in his pockets, took the keys to my truck and got in and pulled off.”
She said one of them approached on foot in the 2400 block of 14th Street, NW. The other was in a black BMW, both of them armed with guns. She said the robbers were bold taking her husband’s phone, but then giving it back because it wasn’t to their liking.
“They basically looked at that phone and was like ‘Oh, that’s an Android? We don’t want this. I thought it was an iPhone,’” she said.
Leave the Android, take the cannoli.
The Impassioned Moderate, a year ago:
News came out a few weeks ago that Bending Spoons, a consumer app studio, raised a massive $340 million round of financing. The press gushed about it: “Hollywood star, tech execs invest in Italian start-up Bending Spoons”, “Ryan Reynolds invests in ‘terrifying’ Italian start-up Bending Spoons”. And Ryan himself said things that are just so easy to imagine him saying (a testament to the spectacular job he’s done branding himself): “Their apps enable anyone to become a creative genius with minimum effort. In fact, their products terrify me so much, I had to invest.” (Ironically - or not? - his ad agency is called Maximum Effort…)
The problem? Bending Spoons is one of the most predatory actors on the entire App Store - they’re terrifying in a completely different way.
Bending Spoons’s business model is to buy successful apps, change them to a weekly auto-renewing subscription model that perhaps tricks users into signing up, and use the revenue to buy more apps and repeat the cycle. Filmic, for example, now defaults to a $3/week subscription — over $150/year. To be fair, there’s also a $40/year subscription.
It doesn’t seem like a scam, per se, but it doesn’t seem like a product-driven company. Apps seemingly don’t thrive after acquisition by Bending Spoons — instead, they get bled dry. There are some apps where a weekly subscription makes sense — Flighty comes to mind, for occasional travelers — but a camera app? Feels deceptive.
Bending Spoons is a big company with a lot of revenue, and it spends a lot of money on App Store and Play Store search ads. (Here’s Tim Cook visiting their office last year.)
The timing is surely coincidental with regard to the news about Filmic, but, as they say, fortune favors the prepared.
Jaron Schneider, reporting for PetaPixel:
Filmic, or FiLMiC as written by the brand, no longer has any dedicated staff as parent company Bending Spoons has laid off the entire team including the company’s founder and CEO, PetaPixel has learned. Considered for years as the best video capture application for mobile devices, the team behind Filmic Pro and presumably Filmic Firstlight — the company’s photo-focused app — has been let go. [...]
It is unclear what Bending Spoons intends to do with Filmic Pro or Filmic Firstlight, but there were early signs of trouble when the company’s most recent major update was last year. The most recent notable update to Filmic Pro came in October which brought support for Apple Log into the app, but there was no mention of the addition of external SSD support, odd considering that Filmic Pro had a strong track record for updating its platform to work with all of the new iPhone updates — especially those that are particularly important for video.
In Filmic’s absence, Blackmagic Design’s iOS app has become the most popular way to capture footage with the new iPhones and was used by Apple’s in-house team for the production of its Mac event on October 31.
Hate this, but I’m sadly not at all surprised. Filmic has an incredible product they were afraid to charge for, and when they finally changed pricing models it was too little, too late, and users rebelled. If they had been charging $100 a year, or even $100 up front, back in 2015, I think they could have survived without selling to the Bending Spoons vultures. But now they’ve got a subscription app that isn’t actively improving, and free competition from Blackmagic, which uses its apps as loss leaders. Hate it.
Filmic was featured by Apple in numerous iPhone keynotes and App Store promotions over the years — for a long stretch it was undeniably the premier “pro” video camera app for iPhones.
Aditya Kalra and Munsif Vengattil, reporting for Reuters from New Delhi:
India wants to implement a European Union rule that will require smartphones to have a universal USB-C charging port, and has been in talks with manufacturers about introducing the requirement in India by June 2025, six months after the deadline in the EU. While all manufacturers including Samsung have agreed to India’s plan, Apple is pushing back. [...]
In a closed-door Nov. 28 meeting chaired by India’s IT ministry, Apple asked officials to exempt existing iPhone models from the rules, warning it will otherwise struggle to meet production targets set under India’s production-linked incentive (PLI) scheme, according to the meeting minutes seen by Reuters. [...]
In terms of market share, Apple accounts for 6% of India’s booming smartphone market, compared with just about 2% four years ago. Apple suppliers have expanded their facilities and make most iPhone 12, 13, 14 and 15 models in India for local sales and exports, Counterpoint Research estimates. Only iPhone 15 has the new universal charging port. Apple told Indian officials in the meeting that the “design of the earlier products cannot be changed,” the document showed.
Consumers in India’s price-conscious market prefer buying older models of iPhones which typically become cheaper with new launches, and India’s push for the common charger on older models could hit Apple’s targets, said Prabhu Ram, head of the Industry Intelligence Group at CyberMedia Research. “Apple’s fortunes in India have primarily been tied to older generation iPhones,” he said.
I was under the impression that the EU’s USB-C requirement would apply only to new devices, but maybe not? A plain reading of this EU press release suggests that all phones sold, starting in 2025, must have USB-C charging ports:
By the end of 2024, all mobile phones, tablets and cameras sold in the EU will have to be equipped with a USB Type-C charging port. From spring 2026, the obligation will extend to laptops.
That would mean that, starting in January 2025, the only iPhones available in the EU will be this year’s iPhones 15 and next year’s iPhones 16. A new fourth-generation iPhone SE with USB-C would give Apple a much-needed lower-priced model. The second-gen SE came in 2020; the current third-gen SE in 2022.
See also: Ben Lovejoy at 9to5Mac.
I finally got around to scratching a longstanding itch. I’m an inveterate web browser tab hoarder, and a scenario I frequently encounter is wanting to move the most recent (typically, rightmost) tabs into a new window all by themselves. Let’s say, for example, I have 26 tabs open in the frontmost Safari window, A through Z. The currently selected tab is X. This script will move tabs X, Y, and Z to a new window, leaving tabs A through W open in the old window. It starts with the current tab and moves it, along with every tab to its right.
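For the curious, here’s a minimal sketch of the logic in AppleScript (not the exact script linked above), assuming Safari’s standard scripting dictionary and, in particular, that its tabs respond to the stock move command:

    tell application "Safari"
        set oldID to id of front window
        set splitIndex to index of current tab of window id oldID
        set tabTotal to count of tabs of window id oldID
        make new document -- opens a new window containing one empty tab
        set newID to id of front window
        repeat (tabTotal - splitIndex + 1) times
            -- Tabs renumber as each one leaves, so the next tab to move is
            -- always the one sitting at splitIndex in the old window.
            move tab splitIndex of window id oldID to end of tabs of window id newID
        end repeat
        close tab 1 of window id newID -- discard the new window's empty tab
    end tell

Paste that into Script Editor and it behaves as described: tabs X through Z hop to the new window, and A through W stay put.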
I have the script saved in my FastScripts scripts folder for Safari, but I tend to invoke it from LaunchBar (which I have configured to index my entire scripts folder hierarchy). Command-Space to bring up LaunchBar, type “spl” to select this script, hit Return, done.
I have no idea how many others might want this, but in recent years here at DF I’ve gotten away from sharing my occasional scripting hacks, and feel like I ought to get back to sharing them. Can’t let Dr. Drang have all the fun.
Update: Leon Cowle adapted my script to be more elegant and concise. If you’re using this but grabbed the script before 10:30pm ET, go back and re-grab it.
Back in August I ran a poll on Mastodon, asking my followers if they have iCloud Advanced Data Protection enabled. iCloud Advanced Data Protection was announced a year ago this week, alongside support for hardware security keys (e.g. Yubico’s YubiKeys). The results, from 2,304 responses:
Count me in that last group. I’ve got a handful of old devices that I still use which can’t be updated to an OS version that supports the feature. But one of these days I’ll just sign out of iCloud on those devices and enable this.
As ever when I run polls like this, it should go without saying that the Daring Fireball audience is not representative of the general public. The results for this poll — with nearly 30 percent of respondents having an esoteric security feature enabled — exemplify that.
One of Apple’s latest accessibility features is Personal Voice — for people who are “at risk of voice loss or have a condition that can progressively impact your voice”, Personal Voice lets you create a voice that sounds like you.
The Lost Voice is a two-minute short film directed by Taika Waititi celebrating this feature. It’s a splendid, heartwarming film, and it’s especially remarkable to see so much effort, such high production values and filmmaking talent, being applied to marketing a feature for a tiny fraction of Apple’s users. Most people do not need this feature. But for those who do, it seems life-altering. Genuinely profound.
Apple at its very best.
See also: Shelly Brisbin at Six Colors.
Three thoughts:
I did not expect to hear a Tom Petty song in a GTA trailer, but I love it. It works. (Hard to escape the feeling though that the Petty estate is willing to sell songs in ways Petty himself wouldn’t have.)
The game looks amazing.
“Coming 2025”! Holy smokes, this game has been in development for a decade. (GTA 5 came out in late 2013 and has sold 190 million copies and generated over $8 billion.)
You’ve probably seen Infinite Mac, the web-based emulator of classic Mac OS, before. But Software Inc. — a new company from some of the people behind Workflow, which became Shortcuts after acquisition by Apple — used it to create their company website, and it’s delightful.
My thanks to Kolide for sponsoring last week at DF. Getting OS updates installed on end user devices should be easy. After all, it’s one of the simplest yet most impactful ways that every employee can practice good security. On top of that, every MDM solution promises that it will automate the process and install updates with no user interaction needed. Yet in the real world, it doesn’t play out like that. Users don’t install updates and IT admins won’t force installs via forced restart.
With Kolide, when a user’s device — be it Mac, Windows, Linux, or mobile — is out of compliance, Kolide reaches out to them with instructions on how to fix it. The user chooses when to restart, but if they don’t fix the problem by a predetermined deadline, they’re unable to authenticate with Okta.
Watch Kolide’s on-demand demo to learn more about how it enforces device compliance for companies with Okta.
Nilay Patel returns to the show. Topics include the iPhones 15, journalism in the age of AI, and what it’s like to have Barack Obama on your podcast.
Faruk Korkmaz posits a seemingly likely explanation for that “computational photography glitch in a bridal shop” photo: it was taken in Panoramic mode. The subject claims it wasn’t a Panoramic mode photo, but she didn’t snap the photo, and if a photo taken in Panoramic mode isn’t wide enough to reach some threshold, the Photos app does not identify/badge it as such. And conversely, a normal photograph cropped to a very wide aspect ratio will be badged as Panoramic — like this and this from my own library — even though it wasn’t snapped in Panoramic mode.
I think it’s quite likely Korkmaz is correct that this is the explanation for how this photo was created; I remain unconvinced that it wasn’t a deliberate publicity stunt.
This is just an astonishing 20-minute film by Hiroshi Sumi. An homage and loving look back at the earliest days of Industrial Light and Magic. I don’t want to say much more than that lest I spoil the wonder of it. I don’t know why anyone would exert so much effort to make something like this, but I’m so inordinately delighted that Sumi did. It speaks to the power of obsession.
After you watch it, take a look at this tweet from Sumi, and this prototype rendering from three years ago.
Just amazing. So much obvious love. (Via Todd Vaziri.)
Katie Tarasov, CNBC:
In November, CNBC visited Apple’s campus in Cupertino, California, the first journalists allowed to film inside one of the company’s chip labs. We got a rare chance to talk with the head of Apple silicon, Johny Srouji, about the company’s push into the complex business of custom semiconductor development, which is also being pursued by Amazon.
“We have thousands of engineers,” Srouji said. “But if you look at the portfolio of chips we do: very lean, actually. Very efficient.”
Can’t say there’s any news in this, but it’s neat to see inside the chip-testing lab. (Same video is available on YouTube, too, if that’s your jam.)
Luke Bouma, writing for Cord Cutters News:
Today, Cord Cutters News has confirmed that Amazon is adding full-screen video ads that will play when you start your Fire TV unless you quickly perform an action on it.
This new update will be rolling out to all Fire TVs made in 2016 or newer. With this update, the ad at the top of your Fire TV will now start playing full-screen, often promoting a movie or TV show. By hitting the home button, you can quickly exit the ad or if you quickly perform an action on the Fire TV once it finishes, you will avoid the video ad, but you only have a few seconds.
“Our focus is on delivering an immersive experience so customers can enjoy their favorite TV shows and movies, as well as browse and discover more content they’ll want to watch. We’re always working to make the Fire TV experience better for customers and have updated one of the prominent placements in the UI to play a short content preview if no other action is taken by a customer upon turning on their Fire TV,” Amazon said in a statement to Cord Cutters News.
What a load of horseshit from Amazon in that statement. Autoplaying ads aren’t “immersive”. And this is in no way “working to make the Fire TV experience better for customers”. Working to make things better would mean getting rid of shit like this, not adding it.
I really don’t understand how anyone uses anything but an Apple TV box. Apple TV is far from perfect but holy hell, it really does start from the perspective of respecting you, the user. The people at Apple who make it are obviously trying to create the experience that they themselves want when they’re watching TV at home.
Wesley Hilliard, self-described “Rumor Expert”, writing at AppleInsider:
A U.K. comedian and actor named Tessa Coates was trying on wedding dresses when a shocking photo of her was taken, according to her Instagram post shared by PetaPixel. The photo shows Coates in a dress in front of two mirrors, but each of the three versions of her had a different pose.
One mirror showed her with her arms down, the other mirror showed her hands joined at her waist, and her real self was standing with her left arm at her side. To anyone who doesn’t know better, this could prove to be quite a shocking image.
To the contrary, to anyone who “knows better”, this image clearly seems fake. But it’s a viral sensation:
Coates, in her Instagram description, claims “This is a real photo, not photoshopped, not a pano, not a Live Photo”, but I’m willing to say she’s either lying or wrong about how the photo was taken. Doing so feels slightly uncomfortable, given that the post was meant to celebrate her engagement, but I just don’t buy it. These are three entirely different arm poses, not three moments in time fractions of a second apart — and all three poses in the image are perfectly sharp. iPhone photography just doesn’t work in a way that would produce this image. I’d feel less certain this was a fake if there were motion blur in the arms in the mirrors. You can get very weird-looking photos from an iPhone’s Pano mode, but again, Coates states this is not a Pano mode image. (Perhaps you can generate an image like this using a Google Pixel 8’s Best Take feature, but this is purportedly from an iPhone, which doesn’t have a feature like that. And even with Best Take, that’s a feature you invoke manually, using multiple original images as input. I don’t think any phone camera, let alone an iPhone, produces single still images such as this.)
From a thread on Threads, where several commenters are rightfully skeptical:
Tyler Stalman (who hosts a great podcast on photography and videography):
Any iPhone photographer can confirm that this is not an image processing error, it would never look like this.
David Imel (a writer/researcher for MKBHD):
I really, REALLY do not think this is a real image. HDR on phones takes 5-7 frames with split-second exposure times. Whole process like .05 sec. Even a live photo is < 2 seconds.
Even if the phone thought they were diff people it wouldn’t stitch like this and wouldn’t have time.
This is spreading everywhere and it’s driving me insane.
I challenge anyone who thinks this is legit to produce such an image using an iPhone with even a single mirror in the scene, let alone two. If I’m wrong, let me know.
Update 1: Claude Zeins takes me up on my challenge.
Update 2: In a long-winded story post, Coates says she went to an Apple Store for an explanation and was told by Roger, the “grand high wizard” of Geniuses at the store, that Apple is “beta testing” a feature like Google’s Best Take. Which is not something Apple does, and which, even if it were happening, would require her to have knowingly installed an iOS beta.
Update 3: Best theory to date: it was, despite Coates’s claim to the contrary, taken in Panoramic mode.
Jason Snell:
Castro has been a popular iOS podcast app for many years, but right now things look grim.
The cloud database that backs the service is broken and needs to be replaced. As a result, the app has broken. (You can’t even export subscriptions out of it, because even that function apparently relies on the cloud database.) “The team is in the progress of setting up a database replacement, which might take some time. We aim to have this completed ASAP,” said an Xtweet from @CastroPodcasts.
What’s worse, according to former Castro team member Mohit Mamoria, “Castro is being shut down over the next two months.”
I always appreciated Castro — it’s a well-designed, well-made app that embraced iOS design idioms. But as a user it just never quite fit my mental model for how a podcast client should work, in the way that Overcast does. I wanted to like Castro more than I actually liked it.
As a publisher, I can tell you Castro was the 4th or 5th most popular client for The Talk Show for a while, but in recent years it has slipped. Right now it’s 10th — and the popularity curve is logarithmic, so 10th place is a long way back. Overcast remains 1st; Apple Podcasts 2nd. The truth is, if not for Overcast, Castro would likely be in that top position, not shutting down. But Overcast does exist, and it’s the app where most people with exquisite taste in UI are listening to podcasts. There aren’t many markets where listeners of The Talk Show are in the core demographic, but iOS podcast apps are one. I can’t say why or precisely when, but somewhere along the line Castro lost its mojo.
I salute everyone who’s worked on it, though, because it really is a splendid app.
Jason Snell, writing at Six Colors:
Last month I wrote about how Apple’s cascade of macOS alerts and warnings ruin the Mac upgrade experience. [...]
This issue was brought home to me last week when I was reviewing the M3 iMac and the M3 MacBook Pro. As a part of reviewing those computers, I used Migration Assistant to move a backup of my Mac Studio to the new systems via a USB drive. Sometimes I try to review a computer with nothing migrated over, but it can be a real slowdown and I didn’t really have any time to spare last week.
Anyway, by migrating, I got to (twice) experience Apple’s ideal process of moving every user from one Mac to the next. You start up your new computer, migrate from a backup of the old computer, and then start using the new one. There’s a lot that’s great about this process, and it’s so much better than what we used to have to do to move files over from one Mac to another.
And yet all of Apple’s security alerts got in the way again and spoiled the whole thing. Here’s a screenshot I took right after my new Mac booted for the first time after migration.
I went through the exact same thing. Except if I had taken a screenshot of all the security-permission alerts I had to go through, there would have been more of them — and Snell’s screenshot looks like a parody. Back in the heyday of the “Get a Mac” TV ad campaign, Apple justifiably lambasted Windows Vista for its security prompts, but that’s exactly the experience you get after running Migration Assistant on a Mac today. It’s terrible.
Don’t get me wrong: Migration Assistant is borderline miraculous. It’s a wonderful tool that seemingly just keeps getting better. But MacOS itself stores too many security/privacy settings in a way that ties them to the device, not to your user account. There ought to be some way to OK all these things in one fell swoop.
As Snell says, setting up a new Mac should be a joy, not a chore. Migration Assistant takes care of so much, but these cursed security prompts spoil the experience.
Kyle Melnick, reporting last week for The Washington Post under the headline “A Toddler Was Taken in a Carjacking; VW Wanted $150 for GPS Coordinates, Lawsuit Says”:
Shepherd, who was four months pregnant, tried to fight off the man. But she was thrown to the pavement and run over by her own car as the man drove away with Isaiah in the back seat, authorities said. Shepherd thought she might never see her son again.
After Shepherd frantically called 911, investigators contacted Volkswagen’s Car-Net service, which can track the location of the manufacturer’s vehicles. They hoped to locate Isaiah.
But a customer service representative said that wouldn’t be possible because Shepherd’s subscription to the satellite service had expired, according to a new lawsuit. The employee said he couldn’t help until a $150 payment was made, the complaint said.
This perfectly illustrates the perils of Apple eventually charging for Emergency SOS satellite service. If Apple someday cuts off free service for compatible iPhones, sooner or later someone is going to die because they chose not to pay to continue service. No one wants that.
Apple Newsroom, two weeks ago:
One year ago today, Apple’s groundbreaking safety service Emergency SOS via satellite became available on all iPhone 14 models in the U.S. and Canada. Now also available on the iPhone 15 lineup in 16 countries and regions, this innovative technology — which enables users to text with emergency services while outside of cellular and Wi-Fi coverage — has already made a significant impact, contributing to many lives being saved. Apple today announced it is extending free access to Emergency SOS via satellite for an additional year for existing iPhone 14 users.
My hunch on this is that Apple would like to make this available free of charge in perpetuity, but wasn’t sure how much it would actually get used, and thus how much it would actually cost. If they come right out and say it’s free forever, then it needs to be free forever. It’s safer to just do what they’ve done here: extend the free service one year at a time, and see how it goes as more and more iPhones that support the feature remain in active use.
It’s a wonderful feature — quite literally life-saving in numerous cases — but it’d be hard to sell. It’s like buying insurance. People like paying for stuff they want to use, not for stuff they hope they never need. Obviously, people do buy insurance — Apple itself, of course, sells AppleCare — but how many people would pay extra for Emergency SOS? If Apple can just quietly eat the cost of this service, they should, and I think will.
Andrew Ross Sorkin and Robert D. Hershey Jr., reporting for The New York Times:
Charles T. Munger, who quit a well-established law career to be Warren E. Buffett’s partner and maxim-spouting alter-ego as they transformed a foundering New England textile company into the spectacularly successful investment firm Berkshire Hathaway, died on Tuesday in Santa Barbara, Calif. He was 99.
His death, at a hospital, was announced by Berkshire Hathaway. He had a home in Los Angeles.
Although overshadowed by Mr. Buffett, who relished the spotlight, Mr. Munger, a billionaire in his own right — Forbes listed his fortune as $2.6 billion this year — had far more influence at Berkshire than his title of vice chairman suggested.
Mr. Buffett has described him as the originator of Berkshire Hathaway’s investing approach. “The blueprint he gave me was simple: Forget what you know about buying fair businesses at wonderful prices; instead, buy wonderful businesses at fair prices,” Mr. Buffett once wrote in an annual report. [...]
A $1,000 investment in Berkshire made in 1964 is worth more than $10 million today.
Mr. Munger was often viewed as the moral compass of Berkshire Hathaway, advising Mr. Buffett on personnel issues as well as investments. His hiring policy: “Trust first, ability second.”
A new edition of Munger’s book of aphorisms, Poor Charlie’s Almanack — its title an allusion to Munger’s idol, Benjamin Franklin — is due next week.
AnnaMaria Andriotis, reporting for The Wall Street Journal (News+):
Apple is pulling the plug on its credit-card partnership with Goldman Sachs, the final nail in the coffin of the Wall Street bank’s bid to expand into consumer lending.
The tech giant recently sent a proposal to Goldman to exit from the contract in the next roughly 12-to-15 months, according to people briefed on the matter. The exit would cover their entire consumer partnership, including the credit card the companies launched in 2019 and the savings account rolled out this year.
It couldn’t be learned whether Apple has already lined up a new issuer for the card.
Apple Card is a strange product — everyone I know who has one likes it (including me), but Goldman itself has reported that they’ve lost $3 billion since 2020 on it. The savings accounts are a hit with customers too.
American Express is rumored to be one possible partner, but it would be pretty strange for Apple Cards to transmogrify from MasterCard to Amex cards overnight. There are still a lot of businesses — particularly throughout Europe — that accept MasterCard but not Amex. It’s not just that Apple Card would no longer be accepted at businesses where previously it was; it’s that the switch would highlight the fact that Apple Card is really just an Apple-branded card issued by a company that isn’t Apple. Apple wants you to think of Apple Card as, well, an Apple credit card.
Ian Hickson, who recently left Google after an 18-year stint:
The lack of trust in management is reflected by management no longer showing trust in the employees either, in the form of inane corporate policies. In 2004, Google’s founders famously told Wall Street “Google is not a conventional company. We do not intend to become one.” but that Google is no more.
Much of these problems with Google today stem from a lack of visionary leadership from Sundar Pichai, and his clear lack of interest in maintaining the cultural norms of early Google. A symptom of this is the spreading contingent of inept middle management. [...]
It’s definitely not too late to heal Google. It would require some shake-up at the top of the company, moving the centre of power from the CFO’s office back to someone with a clear long-term vision for how to use Google’s extensive resources to deliver value to users. I still believe there’s lots of mileage to be had from Google’s mission statement (“to organize the world’s information and make it universally accessible and useful”). Someone who wanted to lead Google into the next twenty years, maximising the good to humanity and disregarding the short-term fluctuations in stock price, could channel the skills and passion of Google into truly great achievements.
I do think the clock is ticking, though. The deterioration of Google’s culture will eventually become irreversible, because the kinds of people whom you need to act as moral compass are the same kinds of people who don’t join an organisation without a moral compass.
This jibes with my perception of Google from the outside. Early Google did two things great:
Neither of those things has been true in recent years, and the responsibility clearly falls on Pichai.
Maggie Harrison, writing for Futurism:
The only problem? Outside of Sports Illustrated, Drew Ortiz doesn’t seem to exist. He has no social media presence and no publishing history. And even more strangely, his profile photo on Sports Illustrated is for sale on a website that sells AI-generated headshots, where he’s described as “neutral white young-adult male with short brown hair and blue eyes.”
Ortiz isn’t the only AI-generated author published by Sports Illustrated, according to a person involved with the creation of the content who asked to be kept anonymous to protect them from professional repercussions. “There’s a lot,” they told us of the fake authors. “I was like, what are they? This is ridiculous. This person does not exist.”
“At the bottom [of the page] there would be a photo of a person and some fake description of them like, ‘oh, John lives in Houston, Texas. He loves yard games and hanging out with his dog, Sam.’ Stuff like that,” they continued. “It’s just crazy.”
The AI authors’ writing often sounds like it was written by an alien; one Ortiz article, for instance, warns that volleyball “can be a little tricky to get into, especially without an actual ball to practice with.”
What an incredible fall from grace for what was, for decades, a truly great magazine. I can see how they thought they’d get away with it, though — Sports Illustrated’s human-written articles are now mostly clickbait junk anyway.
Tangentially related to the last item, here’s Eva Rothenberg reporting for CNN:
Since at least 2019, Meta has knowingly refused to shut down the majority of accounts belonging to children under the age of 13 while collecting their personal information without their parents’ consent, a newly unsealed court document from an ongoing federal lawsuit against the social media giant alleges. [...]
According to the 54-count lawsuit, Meta violated a range of state-based consumer protection statutes as well as the Children’s Online Privacy Protection Rule (COPPA), which prohibits companies from collecting the personal information of children under 13 without a parent’s consent. Meta allegedly did not comply with COPPA with respect to both Facebook and Instagram, even though “Meta’s own records reveal that Instagram’s audience composition includes millions of children under the age of 13,” and that “hundreds of thousands of teen users spend more than five hours a day on Instagram,” the court document states.
One Meta product designer wrote in an internal email that the “young ones are the best ones,” adding that “you want to bring people to your service young and early,” according to the lawsuit.
Not a good look.
The unsealed complaint also alleges that Meta knew that its algorithm could steer children toward harmful content, thereby harming their well-being. According to internal company communications cited in the document, employees wrote that they were concerned about “content on IG triggering negative emotions among tweens and impacting their mental well-being (and) our ranking algorithms taking [them] into negative spirals & feedback loops that are hard to exit from.”
On that last point, Jason Kint posted a long thread on Twitter/X highlighting previously redacted details from the lawsuit.
Jeff Horwitz and Katherine Blunt, reporting for The Wall Street Journal:
The Journal sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform. Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos — and ads for some of the biggest U.S. brands.
The Journal set up the test accounts after observing that the thousands of followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults. The Journal also tested what the algorithm would recommend after its accounts followed some of those users as well, which produced more-disturbing content interspersed with ads.
In a stream of videos recommended by Instagram, an ad for the dating app Bumble appeared between a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting up her shirt to expose her midriff. In another, a Pizza Hut commercial followed a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl.
Worse, Meta has known of the Journal’s findings since August and the problem continues:
The Journal informed Meta in August about the results of its testing. In the months since then, tests by both the Journal and the Canadian Centre for Child Protection show that the platform continued to serve up a series of videos featuring young children, adult content and apparent promotions for child sex material hosted elsewhere.
As of mid-November, the center said Instagram is continuing to steadily recommend what the nonprofit described as “adults and children doing sexual posing.”
There’s no plausible scenario where Instagram wants to cater to pedophiles, but it’s seemingly beyond their current moderation capabilities to determine the content of videos at scale. Solving this ought to be their highest priority.
Rogue Amoeba:
Transcribe can convert speech from an astonishing 57 languages into text, providing you with a written transcript of any spoken audio. It’s powered by OpenAI’s automatic speech recognition system Whisper, and features two powerful models for fast and accurate transcriptions.
Best of all, unlike traditional transcription services, Transcribe works for free inside of Audio Hijack. There’s absolutely no ongoing cost, so you can generate unlimited transcriptions and never again pay a per-minute charge. It’s pretty incredible.
It’s also completely private. When you use Transcribe, everything happens right on your Mac. That means your data is never sent to the cloud, nor shared with anyone else.
This makes for a perfect one-two shot with Retrobatch 2: Audio Hijack is also a node-based media tool (one that predates Retrobatch), and its new Transcribe block likewise puts powerful machine learning tools into an easily accessible form.
This Transcribe feature in Audio Hijack is also an exemplar of the power of Apple silicon — it works on Intel-based Macs too, but it’s just incredibly fast on Apple silicon (I suspect because of the Neural Engine on every M-series chip).
Gus Mueller, writing at the Flying Meat blog:
In case you’re not aware, Retrobatch is a node-based batch image processor, which means you can mix, match, and combine different operations together to make the perfect workflow. It’s kind of neat. And version 2 is even neater. [...]
Retrobatch is obviously not Flying Meat’s most important app (Acorn would fill that role), but I really do like working on it and there’s a bunch more ideas that I want to implement. I feel like Retrobatch is an app that the Mac needs, and it makes me incredibly happy to read all the nice letters I get from folks when they figure out how to use it in their daily work.
Five years after Retrobatch 1 shipped, I’m happy to see version 2 out in the world. And I can’t wait to see what folks are going to do with it.
“Node-based batch image processor” means that you design and tweak your own image processing workflows not with code, but through a visual drag-and-drop interface. (But you can use code, via nodes for JavaScript, AppleScript, and shell scripts.) You can program your own highly customized image processing workflows without knowing anything about writing code. It’s useful for creating workflows that work on just one image at a time, but Retrobatch really shines for batch processing.
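To make the contrast concrete, here’s roughly what one such step looks like when you write it by hand: a minimal AppleScript sketch using macOS’s built-in Image Events (the folder path is hypothetical, and it overwrites the originals in place):

    set srcFolder to (POSIX file "/Users/me/Pictures/originals") as alias
    tell application "Finder" to set theFiles to (every file of srcFolder) as alias list
    tell application "Image Events"
        launch -- Image Events is a faceless background app; launch it before use
        repeat with f in theFiles
            set img to open f
            scale img to size 1024 -- longest edge becomes 1024 pixels
            save img -- saves over the original, in its original format
            close img
        end repeat
    end tell

In Retrobatch, that whole loop is a couple of nodes you drag into place.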
There are a zillion new features in version 2, but the star of the show has to be the new “ML Super Resolution” 4× upscaler: a powerful machine learning model made easily accessible.
I can’t read or play music, and struggle even to clap to a beat, so I would have zero use for this device. But I still want to buy one. Just look at it. Absolutely gorgeous.
Special guest Gabe Rivera, founder of the indispensable news aggregator Techmeme, joins the show to talk about the state of news and social media. Thanksgiving fun for the entire family — turn the volume down on the Packers-Lions game tomorrow and listen to this instead. (Turn the volume back up, of course, for the Commanders-Cowboys game.)
Elizabeth Laraki, in an article-length post on Twitter/X:
15 years ago, I helped design Google Maps. I still use it everyday. Last week, the team dramatically changed the map’s visual design. I don’t love it. It feels colder, less accurate and less human. But more importantly, they missed a key opportunity to simplify and scale. [...]
So much stuff has accumulated on top of the map. Currently there are ~11 different elements obscuring it:
- Search box
- 8 pills overlayed in 4 rows
- A peeking card for “latest in the area”
- A bottom nav bar
This is a very long way of saying that Google Maps’s app design should be more like Apple Maps’s. In fact, Apple Maps has fewer UI elements obscuring the actual map content than she’s proposing for Google Maps.
Nilay Patel and Alex Heath, reporting for The Verge:
Sam Altman will return as CEO of OpenAI, overcoming an attempted boardroom coup that sent the company into chaos over the past several days. Former president Greg Brockman, who quit in protest of Altman’s firing, will return as well.
The company said in a statement late Tuesday that it has an “agreement in principle” for Altman to return alongside a new board composed of Bret Taylor, Larry Summers, and Adam D’Angelo. D’Angelo is a holdover from the previous board that initially fired Altman on Friday. He remains on this new board to give the previous board some representation, we’re told.
People familiar with the negotiations say that the main job of this small initial board is to vet and appoint an expanded board of up to nine people that will reset the governance of OpenAI. Microsoft, which has committed to investing billions in the company, wants to have a seat on that expanded board, as does Altman himself.
The question I’ve focused on from the start of this soap opera: who really controls OpenAI? The board thought it was them. It wasn’t. Matt Levine had the funniest-because-it’s-true take in his Money Stuff column — I don’t want to spoil it, just go there and look at his “slightly annotated” version of OpenAI’s diagram of their corporate structure.
See also: The Wall Street Journal’s compelling story of the drama behind the scenes (News+ link).
More information on the aforelinked secret program that provides U.S. law enforcement with trillions of phone call records, including location data, from the EFF:
“Hemisphere” came to light amidst the public uproar over revelations that the NSA had been collecting phone records on millions of innocent people. However, Hemisphere wasn’t a program revealed by Edward Snowden’s leaks, but rather its exposure was pure serendipity: a citizen activist in Seattle discovered the program when shocking presentations outlining the program were provided to him in response to regular old public records requests.
This slide deck hosted by the EFF is one of those presentations, and worth your attention. The system’s capabilities are terrifying. From page 9 of that deck, highlighting Hemisphere’s “Special Features”:
Dropped Phones — Hemisphere uses special software that analyzes the calling pattern of a previous target phone to find the new number. Hemisphere has been averaging above a 90% success rate when searching for dropped phones.
Additional Phones — Hemisphere utilizes a similar process to determine additional cell phones the target is using that are unknown to law enforcement.
So if a target throws away their phone, switches to a new burner phone, but continues calling the same people, Hemisphere claims a 90 percent success rate identifying that new phone.
Advanced Results — Hemisphere is able to provide two levels of call detail records for one target number by examining the direct contacts for the original target, and identifying possibly significant numbers that might return useful CDRs.
So the system analyzes not just the phone records of the target, but the records of every single number the target calls.
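To make “two levels of call detail records” concrete, here’s a toy sketch of that two-hop lookup, with made-up numbers and a made-up record format:

    -- Each toy record is {caller, callee}.
    set callRecords to {{"555-0001", "555-0002"}, {"555-0001", "555-0003"}, ¬
        {"555-0002", "555-0004"}, {"555-0003", "555-0005"}, {"555-0009", "555-0010"}}
    set target to "555-0001"

    -- First hop: every number the target called.
    set firstHop to {}
    repeat with rec in callRecords
        if item 1 of rec is target then set end of firstHop to item 2 of rec
    end repeat

    -- Second hop: every number the target's direct contacts called, i.e. the
    -- "Advanced Results" feature described in the deck.
    set secondHop to {}
    repeat with rec in callRecords
        if firstHop contains (item 1 of rec) then set end of secondHop to item 2 of rec
    end repeat

    return {firstHop, secondHop} -- {{"555-0002", "555-0003"}, {"555-0004", "555-0005"}}

Scale that up from five toy records to trillions of real ones, add location data, and you have some sense of what law enforcement is querying.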
Page 20 of the deck is highly redacted:
Hemisphere can capture data regarding local calls, long distance calls, international calls, cellular calls [???]
Hemisphere does NOT capture █████████████████████████ subscriber information [???]
Highlights of any basic request include: █████████████████████████ █████████████████████████████████ temporary roaming and location data, and traffic associated with international numbers
I’m using “[???]” to denote spots where I suspect information has been redacted, and “█” to indicate obvious redactions. I sure would love to know what’s redacted there. Again, my mind runs to text messages.
Dell Cameron and Dhruv Mehrotra, reporting for Wired:
A little-known surveillance program tracks more than a trillion domestic phone records within the United States each year, according to a letter Wired obtained that was sent by US senator Ron Wyden to the Department of Justice (DOJ) on Sunday, challenging the program’s legality.
According to the letter, a surveillance program now known as Data Analytical Services (DAS) has for more than a decade allowed federal, state, and local law enforcement agencies to mine the details of Americans’ calls, analyzing the phone records of countless people who are not suspected of any crime, including victims. Using a technique known as chain analysis, the program targets not only those in direct phone contact with a criminal suspect but anyone with whom those individuals have been in contact as well.
The DAS program, formerly known as Hemisphere, is run in coordination with the telecom giant AT&T, which captures and conducts analysis of US call records for law enforcement agencies, from local police and sheriffs’ departments to US customs offices and postal inspectors across the country, according to a White House memo reviewed by Wired. Records show that the White House has, for the past decade, provided more than $6 million to the program, which allows the targeting of the records of any calls that use AT&T’s infrastructure — a maze of routers and switches that crisscross the United States.
In a letter to US attorney general Merrick Garland on Sunday, Wyden wrote that he had “serious concerns about the legality” of the DAS program, adding that “troubling information” he’d received “would justifiably outrage many Americans and other members of Congress.” That information, which Wyden says the DOJ confidentially provided to him, is considered “sensitive but unclassified” by the US government, meaning that while it poses no risk to national security, federal officials, like Wyden, are forbidden from disclosing it to the public, according to the senator’s letter.
Ron Wyden and his office are indispensable on matters related to government surveillance. A few non-obvious aspects worth considering regarding the DAS/Hemisphere program:
The information collected by DAS includes location data.
This is not just about AT&T wireless customers and their phone calls. This is related to the entire U.S. phone system infrastructure — the old Ma Bell. Landline calls and calls from Verizon and T-Mobile cellular customers get routed through this AT&T system, and are thus surveilled by this same system. You can use over-the-top services like iMessage, FaceTime, WhatsApp, or Signal to avoid DAS, but if you place calls using the traditional phone system, you could be impacted even if you’re not an AT&T customer — and you won’t ever know, because you have no idea how your phone calls are routed.
It is completely unclear to me whether DAS/Hemisphere collects text messages — SMS, MMS, RCS — in addition to voice calls. I’ve spent my afternoon trying to find out, and the only answer I’ve gotten is that it’s unclear. I hope text messages are not included, but until we get a definitive answer, it’s only safe to assume that they are. (If anyone reading this knows whether DAS includes text message records, please let me know.)
Since last I wrote about the ongoing leadership battle at OpenAI:
OpenAI named a new interim CEO, Twitch co-founder Emmett Shear. (Shear is an AI worrier who has advocated drastically “slowing down”, writing “If we’re at a speed of 10 right now, a pause is reducing to 0. I think we should aim for a 1-2 instead.”) OpenAI CTO Mira Murati was CEO for about two days.
Satya Nadella announced very late Sunday night, “And we’re extremely excited to share the news that Sam Altman and Greg Brockman, together with colleagues, will be joining Microsoft to lead a new advanced AI research team. We look forward to moving quickly to provide them with the resources needed for their success.”
About 700 OpenAI employees, out of a total of 770, signed an open letter demanding the OpenAI board resign, and threatening to quit to join Altman at Microsoft if they don’t. Among the signees: Mira Murati (which might explain why she’s no longer interim CEO) and chief scientist and board member Ilya Sutskever.
Sutskever posted on Twitter/X: “I deeply regret my participation in the board’s actions. I never intended to harm OpenAI. I love everything we’ve built together and I will do everything I can to reunite the company.”
Alex Heath and Nilay Patel report for The Verge that Altman and Brockman might still return to OpenAI.
Nadella appeared on CNBC and admitted that Altman and Brockman were not officially signed as Microsoft employees yet, and when asked who would be OpenAI’s CEO tomorrow, laughed, because he didn’t know.
See also: Ben Thompson’s crackerjack take on the saga at Stratechery. Long story short: OpenAI’s company structure was a ticking time bomb.
My thanks to Vanta for sponsoring last week at DF. Vanta lets you shortcut compliance — without shortchanging security.
Vanta brings GRC and security efforts together. Integrate information from multiple systems and reduce risks to your business, all without the need for additional staffing. And because Vanta automates up to 90 percent of the work for SOC 2, ISO 27001, and more, you’ll be able to focus on strategy and security, not maintaining compliance.
From the most in-demand frameworks to third-party risk management and security questionnaires, Vanta gives SaaS businesses of all sizes one place to manage risk and prove security in real time.
Try Vanta free for 7 days. No costs or obligations.
Kevin Roose, reporting for The New York Times:
An all-hands meeting for OpenAI employees on Friday afternoon didn’t reveal much more. Ilya Sutskever, the company’s chief scientist and a member of its board, defended the ouster, according to a person briefed on his remarks. He dismissed employees’ suggestions that pushing Mr. Altman out amounted to a “hostile takeover” and claimed it was necessary to protect OpenAI’s mission of making artificial intelligence beneficial to humanity, the person said.
Mr. Altman appears to have been blindsided, too. He recorded an interview for the podcast I co-host, “Hard Fork,” on Wednesday, two days before his firing. During our chat, he betrayed no hint that anything was amiss, and he talked at length about the success of ChatGPT, his plans for OpenAI and his views on A.I.’s future.
Mr. Altman stayed mum about the precise circumstances of his departure on Friday. But Greg Brockman — OpenAI’s co-founder and president, who quit on Friday in solidarity with Mr. Altman — released a statement saying that both of them were “shocked and saddened by what the board did today.” Mr. Altman was asked to join a video meeting with the board at noon on Friday and was immediately fired, Mr. Brockman said.
Kara Swisher was all over the story last night, writing on Twitter/X:
Sources tell me that the profit direction of the company under Altman and the speed of development, which could be seen as too risky, and the nonprofit side dedicated to more safety and caution were at odds. One person on the Sam side called it a “coup,” while another said it was the right move. [...]
More: The board members who voted against Altman felt he was manipulative and headstrong and wanted to do what he wanted to do. That sounds like a typical SV CEO to me, but this might not be a typical SV company. They certainly have a lot of explaining to do.
According to Brockman — who until he quit in protest of Altman’s firing was chairman of the OpenAI board — he didn’t find out until just 5 minutes before Altman was sacked. I’ve never once heard of a corporate board firing the company’s CEO behind the back of the chairman of the board.
It really does look more and more like a deep philosophical fissure inside OpenAI, between those led by Sutskever (and, obviously, a majority of the board) advocating a cautious, slow, and genuinely non-profit-driven approach, and Altman/Brockman’s “let’s move fast, change the world, and make a lot of money” side. Sutskever and the OpenAI board seemingly see Altman/Brockman as reckless swashbucklers; Altman and Brockman, I suspect, see Sutskever and his side as a bunch of ninnies.
A simple way to look at it is to read OpenAI’s charter, “the principles we use to execute on OpenAI’s mission”. It’s a mere 423 words, and very plainly written. It doesn’t sound anything at all like the company Altman has been running. The board, it appears, believes in the charter. How in the world it took them until now to realize Altman was leading OpenAI in directions completely contrary to their charter is beyond me.
It’s like the police chief in Casablanca being “shocked — shocked!” to find out that gambling was taking place in a casino where he played. ★
Monica Chin, reporting for The Verge last month, “Qualcomm Claims Its Snapdragon X Elite Processor Will Beat Apple, Intel, and AMD”:
Qualcomm has announced its new Snapdragon X Elite platform, which looks to be its most powerful computing processor to date. The chips (including the new Qualcomm Oryon, announced today) are built on a 4nm process and include 136GB/s of memory bandwidth. PCs are expected to ship in mid-2024. [...]
Oh, Qualcomm also claims that its chip will deliver “50% faster peak multi-thread performance” than Apple’s M2 chip. This is just a funny claim; the X Elite has 50 percent more cores than the M2 and sucks down much more power, so of course it is going to do better on Geekbench at “peak multi-thread performance.” That’s like a professional sprinter bragging about winning the 100-meter dash against a bunch of marathon champions.
This news is so old that Chin is no longer on the staff at The Verge (which I think explains why she didn’t write either of their reviews for the new M3 MacBook Pros), but I’m cleaning up old tabs and wanted to comment on this.
It’s nonsense. Chips that aren’t slated to appear in any actual laptops until “mid-2024” are being compared to the M2, which Apple debuted with the MacBook Air in June 2022. So even if Qualcomm’s performance claims are true and PCs based on their chips ship on schedule, they’re comparing against a chip that Apple debuted two entire years earlier.
Plus they’re only comparing multi-core performance against the base M2. And they’re not really comparing multi-core performance overall but “peak” performance, however it is they define that. And the fact that they only mention multi-core performance strongly suggests that they’re slower than the M2 at single-core performance, which for most consumer/prosumer use cases is more important.
And: No one in the PC world seems to care about ARM chips, at least for laptops. Microsoft made a go of it with their Surface line and largely gave up. My understanding is that fewer than 1 percent of PC sales today are ARM-based machines. If Microsoft wasn’t willing to optimize Windows to make it ARM-first, or even treat ARM as an equal to x86, when they themselves were trying to make ARM-based Windows laptops a thing, why would they do it now?
If Mac hardware and MacOS were made by separate companies, and the MacOS software company licensed their OS to other OEMs, I really don’t think Apple silicon hardware would have happened. The seemingly too-good-to-be-true performance of Apple silicon Macs is the result of the silicon being designed for the software and the software being optimized and at very low levels designed for the silicon. Qualcomm isn’t going to get that from Microsoft with Windows.
Qualcomm’s X Elite platform may well beat Intel and AMD, but I’m not sure that will matter in the PC world unless Microsoft truly goes all-in on ARM with Windows. Which I don’t see happening. But the idea that they’re even vaguely catching up to Apple silicon is laughable, and it’s frustrating that so much of the tech press took anything Qualcomm claimed about relative performance against Apple silicon seriously.
We know for a fact that their Snapdragon chips for phones have always lagged years behind Apple’s A-series chips in both sheer performance and performance-per-watt, with no sign that they’re catching up. So how in the world would their ARM chips for PCs beat Apple’s M-series chips?
And, yes, I predicted this back in November 2021, when Qualcomm claimed they’d be shipping “M-series competitive” chips for PCs by 2023. Qualcomm claimed to still be on track to ship in 2023 just one year ago, so I wouldn’t hold my breath for “mid-2024” either. ★
Ina Fried, reporting for Axios:
Apple is pausing all advertising on X, the Elon Musk-owned social network, sources tell Axios. The move follows Musk’s endorsement of antisemitic conspiracy theories as well as Apple ads reportedly being placed alongside far-right content. Apple has been a major advertiser on the social media site and its pause follows a similar move by IBM.
Musk faced backlash for endorsing an antisemitic post Wednesday, as 164 Jewish rabbis and activists upped their call to Apple, Google, Amazon and Disney to stop advertising on X, and for Apple and Google to remove it from their platforms.
The left-leaning nonprofit Media Matters for America published a report Thursday that highlighted Apple, IBM, Amazon and Oracle as among those whose ads were shown next to far-right posts.
It’s fair, in some sense, to describe Media Matters for America as “left-leaning”, but they have a decades-long reputation for accuracy. They don’t publish hit pieces that are later proven to have been manipulated. There’s no reason to doubt their report or the screenshots it contained — especially if, like me, you still do check in on Twitter. This is what it’s like over there now.
Update: Disney and Warner Bros Discovery have suspended advertising on Twitter/X too.
OpenAI press release:
The board of directors of OpenAI, Inc., the 501(c)(3) that acts as the overall governing body for all OpenAI activities, today announced that Sam Altman will depart as CEO and leave the board of directors. Mira Murati, the company’s chief technology officer, will serve as interim CEO, effective immediately. [...]
Mr. Altman’s departure follows a deliberative review process by the board, which concluded that he was not consistently candid in his communications with the board, hindering its ability to exercise its responsibilities. The board no longer has confidence in his ability to continue leading OpenAI.
By corporate board standards, that statement is absolutely scathing. There’s widespread speculation that some sort of scandal must be at the heart of this, but no word yet on what.
i loved my time at openai. it was transformative for me personally, and hopefully the world a little bit. most of all i loved working with such talented people.
will have more to say about what’s next later.
🫡
That all-lowercase style is consistent for Altman’s personal tweets, but man does it look extra fucking silly in a post like this one.
Just last week Altman hosted OpenAI’s first DevDay keynote (where he did great), and, as Cade Metz reports for The New York Times, as late as yesterday Altman was still publicly representing OpenAI:
On Thursday evening, Mr. Altman appeared at an event in Oakland, Calif., where he discussed the future of art and artists now that artificial intelligence can generate images, videos, sounds and other forms of art on its own. Giving no indication that he was leaving OpenAI, he repeatedly said he and the company would continue to work alongside artists and help to ensure their future would be bright.
Earlier in the day, he appeared at the Asia-Pacific Economic Cooperation CEO Summit in San Francisco with Laurene Powell Jobs, who is the founder and president of the Emerson Collective, and executives from Meta and Google.
(Altman also just publicly dunked on Elon Musk. I’m sure Musk will just let it slide.)
Lastly, Altman, despite being credited as a co-founder, has no equity in OpenAI. Meta, for example, can’t fire Zuckerberg — same deal for Google with Larry Page and Sergey Brin — but OpenAI could sack Altman with a board vote because he owns none of it, much less a controlling interest.
Yesterday Apple released developer beta 2 of iOS 17.2, the first version of iOS to include support for capturing spatial video with iPhone 15 Pro models. Today came the public beta, enabling the same feature. Apple invited me to New York yesterday, not merely to preview capturing spatial video using an iPhone, but to experience watching those spatial videos using a Vision Pro.
The experience was, like my first Vision Pro demo back at WWDC in June, astonishing.
Shooting spatial video on an iPhone 15 Pro is easy. The feature is — for now at least — disabled by default, and can be enabled in Settings → Camera → Formats. The option is labeled “Spatial Video for Apple Vision Pro”, which is apt, because (again, for now at least) spatial video only looks different from non-spatial video when playing it back on Vision Pro. When viewed on any other device it just looks like a regular flat video.
Once enabled, you can toggle spatial video capture in the Camera app whenever you’re in the regular Video mode. It’s very much akin to the toggle for Live Photos when taking still photos, or the Action mode toggle for video — not a separate mode in the main horizontal mode switcher in Camera, but a toggle button in the main Video mode.
When capturing spatial video, you have no choice regarding resolution, frame rate, or file format. All spatial video is captured at 1080p, 30 fps, in the HEVC file format.1 You also need to hold the phone horizontally, to put the two capturing lenses on the same plane. The iPhone uses the main (1×) and ultra wide (0.5×) lenses for capture when shooting spatial video, and in fact, Apple changed the arrangement of the three lenses on the iPhone 15 Pro in part to support this feature. (On previous iPhone Pro models, when held horizontally, the ultra wide camera was on the bottom, and the main and telephoto lenses were next to each other on the top.)
I believe resolution is limited to 1080p because, to get an image from the ultra wide 0.5× camera with a field of view equivalent to that of the main 1× camera, the iPhone needs to crop the ultra wide image significantly. The ultra wide camera has a mere 12 MP sensor, so there just aren’t enough pixels to crop a 1×-equivalent field of view from the center of the sensor and get a 4K image.
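A quick back-of-the-envelope check bears this out, assuming the 1× camera’s field of view is roughly half the ultra wide’s linear field of view (24mm vs. 13mm equivalent focal lengths):

```swift
// Rough math: cropping the 12 MP ultra wide sensor down to a
// 1x-equivalent field of view. A 2x linear crop keeps 1/4 of the pixels.
let ultraWideMP = 12.0
let linearCrop = 2.0
let croppedMP = ultraWideMP / (linearCrop * linearCrop) // ~3 MP

let mp1080p = 1920.0 * 1080.0 / 1_000_000 // ~2.1 MP
let mp4K = 3840.0 * 2160.0 / 1_000_000    // ~8.3 MP

print(croppedMP >= mp1080p) // true: enough pixels for 1080p
print(croppedMP >= mp4K)    // false: nowhere near enough for 4K
```

Roughly 3 MP from the crop comfortably covers 1080p’s ~2.1 MP, but falls far short of 4K’s ~8.3 MP.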
There are two downsides to shooting spatial video with your iPhone. First, the aforementioned 1080p resolution and 30 fps frame rate. I’ve been shooting 4K video by default for years, because why not? I wish we could capture spatial video at 4K, but alas, not yet. The second downside to shooting spatial video is that it effectively doubles the file size compared to non-spatial 1080p, for the obvious reason that each spatial video contains two 1080p video streams. That file-size doubling is a small price to pay — the videos are still smaller than non-spatial 4K 30 fps video.2
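To put rough numbers on that doubling, using Apple’s per-minute averages from the footnote, the hourly storage math works out like this:

```swift
// Apple's stated averages per minute of video (see footnote 2).
let regular1080MBPerMin = 65.0
let spatial1080MBPerMin = 130.0 // two 1080p streams: 2 x 65 MB
let regular4KMBPerMin = 190.0

// Projected over an hour of footage, in GB:
print(spatial1080MBPerMin * 60 / 1_000) // 7.8 GB/hour, spatial
print(regular4KMBPerMin * 60 / 1_000)   // 11.4 GB/hour, regular 4K
```

An hour of spatial video still comes in around a third smaller than an hour of regular 4K.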
Really, it’s just no big deal to capture spatial video on your iPhone. If the toggle button is off, you capture regular video with all the regular options for resolution (720p/1080p/4K) and frame rates (24/30/60). If the toggle for spatial video is on — and when on it’s yellow, impossible to miss — you lose those choices, but capturing otherwise looks and works just like shooting regular video. And when you play it back, or share it with others who are viewing it on regular devices like their phones or computers, it just looks like a regular flat video.
If you own an iPhone 15 Pro, there’s no good reason not to start capturing spatial videos this year — like, say, this holiday season — to record any sort of moments that feel like something you might want to experience as “memories” with a Vision headset in the future, even if you don’t plan to buy the first-generation Vision Pro next year.
Before my demo, I provided Apple with my eyeglasses prescription, and the Vision Pro headset I used had appropriate corrective lenses in place. As with my demo back in June, everything I saw through the headset looked incredibly sharp.
Apple has improved and streamlined the onboarding/calibration process significantly since June. There are a few steps where you’re presented with a series of dots in a big circle floating in front of you, like the hour indexes on a clock. As you look at each dot, it lights up a bit, and you do the finger tap gesture. It’s the Vision Pro’s way of calibrating that what it thinks you’re looking at is what you actually are looking at. Once that calibration step was over — and it took just a minute or two — I was in, ready to go on the home screen of VisionOS. (And the precision of this calibration is amazing — UI elements can be placed relatively close to each other and it knows exactly which one you’re looking at when you tap. iOS UI design needs to be much more forgiving of our relatively fat fingertips than VisionOS UI design needs to be about the precision of our gaze.)
My demo yesterday was expressly limited to photography in general, and spatial video in particular, and so, per Apple’s request, I stayed within the Photos app in VisionOS. It was tempting, at times, to see where else I could go and what else I could do. But there was so much to see and do in Photos alone that my demo — about 30 minutes in total wearing Vision Pro — raced by.
Prior to separating us into smaller rooms for our time using Vision Pro, I was paired with Joanna Stern from The Wall Street Journal for a briefing on the details of capturing spatial video using the iPhone 15 Pro. We were each provided with a demo iPhone, and were allowed to capture our own spatial videos. We were in a spacious modern high-ceiling’d kitchen, bathed in natural light from large windows. A chef was busy preparing various forms of sushi, and served as a model for us to shoot. Joanna and I also, of course, shot footage of each other, while we shot each other. It was very meta. The footage we captured ourselves was then preloaded onto the respective Vision Pro headsets we used for our demos.
We were not permitted to capture spatial video on Vision Pro.3 However, our demo units had one video in the Photos library that was captured on Vision Pro — a video I had experienced before, back in June, of a group of twenty-somethings sitting around a fire pit at night, having fun in a chill atmosphere. There were also several other shot-by-Apple spatial videos which were captured using an iPhone 15 Pro.
One obvious question: How different do spatial videos captured using iPhone 15 Pro look from those captured using a Vision Pro itself? Given that Apple provided only one example spatial video captured on Vision Pro, I don’t feel like I can fully answer that based on my experience yesterday. It did not seem like the differences were dramatic or significant. The spatial videos shot using iPhone 15 Pro that I experienced, including those I captured myself, seemed every bit as remarkable as the one captured using Vision Pro.
Apple won’t come right out and say it, but I do get the feeling that, all things considered, spatial video captured using Vision Pro will be “better”. The iPhone might win out on image quality, given that the 1× main camera on the iPhone 15 Pro is the single best camera system Apple makes, but the Vision Pro should win out on spatiality — 3D-ness — because Vision Pro’s two lenses for spatial video capture are roughly as far apart from each other as human eyes. The two lenses used for capture on an iPhone are, of course, much closer to each other than any pair of human eyes. But despite how close the two lenses are to each other, the 3D effect is very compelling on spatial video captured on an iPhone. It’s somehow simultaneously very natural-looking and utterly uncanny.
It’s a stunning effect and a remarkable experience to watch them. And so the iPhone, overall, is going to win out as the “best” capture device for spatial video — even if footage captured on Vision Pro is technically superior — because, as the old adage states, the best camera is the one you have with you. I have my iPhone with me almost everywhere. That will never be even close to true for Vision Pro next year.
Here’s what I wrote about spatial video back in June, after my first hands-on time with Vision Pro:
Spatial photos and videos — photos and videos shot with the Vision Pro itself — are viewed as a sort of hybrid between 2D content and fully immersive 3D content. They don’t appear in a crisply defined rectangle. Rather, they appear with a hazy dream-like border around them. Like some sort of teleportation magic spell in a Harry Potter movie or something. The effect reminded me very much of Steven Spielberg’s Minority Report, in the way that Tom Cruise’s character could obsessively watch “memories” of his son, and the way the psychic “precogs” perceive their visions of murders about to occur. It’s like watching a dream, but through a portal opened into another world.
When you watch regular (non-spatial) videos using Vision Pro, or view regular still photography, the image appears in a crisply defined window in front of you. Spatial videos don’t appear like that at all. I can’t describe it any better today than I did in June: it’s like watching — and listening to — a dream, through a hazy-bordered portal opened into another world.
Several factors contribute to that dream-like feel. Spatial videos don’t look real. It doesn’t look or feel like the subjects are truly there in front of you. The live pass-through video you see in Vision Pro, of the actual real world around you, is another matter entirely. That pass-through video of actual reality is so compelling, so realistic, that in both my demo experiences to date I forgot that I was always looking at video on screens in front of my eyes, not just looking through a pair of goggles with my eyes’ own view of the world around me.
So Vision Pro is capable of presenting video that looks utterly real — because that’s exactly how pass-through video works and feels. Recorded spatial videos are different. For one thing, reality is not 30 fps, nor is it only 1080p. This makes spatial videos not look low-resolution or crude, per se, but rather more like movies. The upscaled 1080p imagery comes across with a film-like grain, and the obviously-lower-than-reality frame rate conveys a movie-like feel as well. Higher resolution would look better, sure, but I’m not sure a higher frame rate would. Part of the magic of movies and TV is that 24 and 30 fps footage has a dream-like aspect to it.
Nothing you’ve ever viewed on a screen, however, can prepare you for the experience of watching these spatial videos, especially the ones you will have shot yourself, of your own family and friends. They truly are more like memories than videos. The spatial videos I experienced yesterday that were shot by Apple looked better — framed by professional photographers, and featuring professional actors. But the ones I shot myself were more compelling, and took my breath away. There’s my friend, Joanna, right in front of me — like I could reach out and touch her — but that was 30 minutes ago, in a different room.
Prepare to be moved, emotionally, when you experience this.
My briefing and demo experience yesterday was primarily about capturing spatial video on iPhone 15 Pro and watching it on Vision Pro, but my demo went through the entire Photos app experience in VisionOS.
Plain old still photos look amazing. You can resize the virtual window in which you’re viewing photos to be as large as you could practically desire. It’s not merely like having a 20-foot display — a size far more akin to that of a movie theater screen than a television. It’s like having a 20-foot display with retina-quality resolution, and the best brightness and clarity of any display you’ve ever used. I spend so much time looking at my own iPhone-captured still photos on my iPhone display that it’s hard to believe how good they can look blown up to billboard-like dimensions. Just plain still photos, captured using an iPhone.
And then there are panoramic photos. Apple first introduced Pano mode back in 2012, with the iPhone 5. That feature has never struck me as better than “Kind of a cool trick”. In the decade since the feature became available, I’ve taken only about 200 of them. They just look too unnatural, too optically distorted, when viewed on a flat display. And the more panoramic you make them, the more unnatural they look when viewed flat.
Panoramic photos viewed using Vision Pro are breathtaking.
There is no optical distortion at all, no fish-eye look. It just looks like you’re standing at the place where the panoramic photo was taken — and the wider the panoramic view at capture, the more compelling the playback experience is. It’s incredible, and now I wish I’d spent the last 10 years taking way more of them.
As a basic rule, going forward, I plan to capture spatial videos of people, especially my family and dearest friends, and panoramic photos of places I visit. It’s like teleportation.
The Vision Pro experience is highly dependent upon foveated rendering, which Wikipedia succinctly describes as “a rendering technique which uses an eye tracker integrated with a virtual reality headset to reduce the rendering workload by greatly reducing the image quality in the peripheral vision (outside of the zone gazed by the fovea).” Our retinas work like this too — we really only see crisply what falls on the maculas at the center of our retinas. Vision Pro really only renders at high resolution what we are directly staring at. The rest is lower-resolution, but that’s not a problem, because when you shift your gaze, Vision Pro is extraordinarily fast at updating the display.
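To make the idea concrete, here’s a toy sketch, with invented angular thresholds, of what foveated rendering amounts to; the real system is far more sophisticated, varying quality smoothly across the frame:

```swift
// Toy foveated rendering: choose a render scale based on angular
// distance from the tracked gaze point. The thresholds here are
// made up for illustration only.
func renderScale(degreesFromGaze: Double) -> Double {
    switch degreesFromGaze {
    case ..<5.0:  return 1.0  // foveal region: full resolution
    case ..<15.0: return 0.5  // parafoveal: half resolution
    default:      return 0.25 // periphery: quarter resolution
    }
}

print(renderScale(degreesFromGaze: 2.0))  // 1.0
print(renderScale(degreesFromGaze: 30.0)) // 0.25
```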
I noticed yesterday that if I darted my eyes from one side to the other fast enough, I could sort of catch it updating the foveation. Just for the briefest of moments, you can catch something at less than perfect resolution. I think. It is so fast at tracking your gaze and updating the displays that I can’t be sure. It’s just incredible, though, how detailed and high-resolution the overall effect is. My demo yesterday was limited to the Photos app, but I came away more confident than ever that Vision Pro is going to be a great device for reading and writing — and thus, well, work.
The sound quality of the speakers in the headset strap is impressive. The visual experience of Vision Pro is so striking — I mean, the product has “Vision” in its name for a reason — that the audio experience is easy to overlook, but it’s remarkably good.
Navigating VisionOS with your gaze and finger taps is so natural. I’ve spent a grand total of about an hour, spread across two 30-minute demos, using Vision Pro, but I already feel at home using the OS. It’s an incredibly natural interaction model based simply on what you are looking at. My enthusiasm for this platform, and the future of spatial computing, could not be higher. ★
The HEVC spec allows for a single file to contain multiple video streams. That’s what Apple is doing, with metadata describing which stream is “left” and which is “right”. Apple released preliminary documentation for this format back in June, just after WWDC. ↩︎
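For the curious, here’s a minimal sketch of how one might detect such a file using AVFoundation’s async loading APIs; I’m assuming the stereo-multiview media characteristic Apple introduced alongside this format, so treat it as illustrative rather than definitive:

```swift
import AVFoundation

// Sketch: check whether any video track reports the stereo multiview
// characteristic that marks MV-HEVC spatial video (iOS 17 / macOS 14).
func isSpatialVideo(at url: URL) async throws -> Bool {
    let asset = AVURLAsset(url: url)
    for track in try await asset.loadTracks(withMediaType: .video) {
        let characteristics = try await track.load(.mediaCharacteristics)
        if characteristics.contains(.containsStereoMultiviewVideo) {
            return true
        }
    }
    return false
}
```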
According to Apple, these are the average file sizes per minute of video:
• Regular 1080p 30 fps: 65 MB
• Spatial 1080p 30 fps: 130 MB
• Regular 4K 30 fps: 190 MB ↩︎︎
Paraphrased:
“This is the digital crown. You’ll be using this today to adjust the immersiveness by turning it, and you’ll press the crown if you need to go back to the home screen. On the other side is the button you would use to capture photos or videos using Vision Pro. We won’t be using that button today.”
“But does that button work? If I did press that button, would it capture a photo or video?”
“Please don’t press that button.” ↩︎︎