University of Pennsylvania President Liz Magill Resigns After Embarrassing Testimony in Congressional Antisemitism Hearing 

Like I wrote the other day, a reckoning was due. In addition to Magill, the chair of Penn’s board of trustees also submitted his resignation. At just 18 months, Magill’s tenure was the shortest of any president in Penn’s history.

If Penn wants to see how you do it, they need look no further than right across Walnut Street.

Verizon Gave a Woman’s Phone Data to an Armed Stalker Who Posed as Cop Over Email 

Joseph Cox, reporting for 404 Media:

The FBI investigated a man who allegedly posed as a police officer in emails and phone calls to trick Verizon to hand over phone data belonging to a specific person that the suspect met on the dating section of porn site xHamster, according to a newly unsealed court record. Despite the relatively unconvincing cover story concocted by the suspect, including the use of a clearly non-government ProtonMail email address, Verizon handed over the victim’s data to the alleged stalker, including their address and phone logs. The stalker then went on to threaten the victim and ended up driving to where he believed the victim lived while armed with a knife, according to the record.

The news is a massive failure by Verizon who did not verify that the data request was fraudulent, and the company potentially put someone’s safety at risk. [...] As the complaint against Glauner notes, this “search warrant” was not correctly formatted and did not include an additional form that is required for search warrants in North Carolina. That, and the Cary Police Department confirmed that no such Steven Cooper is employed with the agency, the document says. The judge who allegedly signed the document, Gale Adams, was shown the document and told investigators the signature was not hers either. Most obviously of all, the document was sent with a ProtonMail email address, which is “not an official government email address,” the complaint says.

Disgraceful.

Ex-Apple Lawyer in Charge of Enforcing Compliance With the Company’s Insider Trading Policies Sentenced to Probation for Insider Trading 

David Thomas, reporting for Reuters:

Apple’s former top corporate lawyer will receive no prison time after pleading guilty last year to U.S. insider trading charges, a judge said on Thursday. U.S. District Judge William Martini in Newark, New Jersey, sentenced Gene Levoff to four years of probation and 2,000 hours of community service. Levoff was also ordered to pay a $30,000 fine and forfeit $604,000. [...]

Levoff ignored quarterly “blackout periods” that barred trading before Apple’s results were released and violated the company’s broader insider trading policy that he himself was responsible for enforcing, prosecutors said.

Who watches the watchmen?

Tip of the Day: You Can Select Multiple Tabs, Then Drag Them, in Safari, Chrome, and Firefox 

Jack Wellborn:

I just recently discovered that you can select and drag multiple Safari tabs by holding Shift or Command, just as you would to select and drag multiple items in Finder.

I had no idea you could do this with tabs. Just like making multiple selections in a list view, Shift-click will select an entire range at once, and Command-clicking lets you select (and deselect) noncontiguous tabs. If I’d known you could do this, I probably never would have written the AppleScript I posted the other day — but if I hadn’t written and posted that script, I don’t think I would have learned this trick. Once you have multiple tabs selected, you can drag them together to create a new window, or do things like close them all at once.

This same trick works in Firefox and Chrome (and Chrome-derived browsers like Brave), too. This trick does not work in Safari on iPadOS, because iPads are baby computers where you can’t select more than one thing at a time.

Update: In a reply on Threads, Jay Robinson points out (and includes a nice screencast) that you can select multiple Safari tabs on iPad with multitouch. Drag one tab out of the tab bar, then, while keeping the drag active with one finger, use another finger to tap additional tabs to add them to the collection of tabs being dragged. But: all you can seemingly do with such a collection of dragged tabs is move them to another area in the current Safari window, or drop them as URLs into another app, like a message in Mail or Apple Notes. You can drag a single tab in iPad Safari to the edge of the screen to move it to a new split screen window, but if you have more than one tab in the drag collection, you can’t do that. Nor can you take group actions on the collection of tabs, like closing them all at once, or closing all tabs in the window other than the selected ones, like you can with the multiple-tab-selection feature in the big-boy Safari on MacOS. You can drag a collection of tabs on iPadOS into a tab group, if you have the sidebar open. That’s useful in combination with tab search, to filter the list of visible tabs — search, select the tabs that match the search term, and drag them together to a new or existing tab group. (You can create a drag collection of multiple tabs in iPhone Safari the same way.)

Apple Quietly Releases MLX, an Open Source Array Framework for Machine Learning on Apple Silicon 

“Quietly” is a much-abused adverb in headlines, but I think apt for this. Apple’s machine learning research team has simply released this new framework on GitHub, with no fanfare.

The MLX examples repo has a variety of examples, including:

  • Transformer language model training.
  • Large-scale text generation with LLaMA and fine-tuning with LoRA.
  • Generating images with Stable Diffusion.
  • Speech recognition with OpenAI’s Whisper.

Seems quite useful already, and lays further groundwork for on-device AI features in the future.
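For a taste of the programming model, here’s a minimal sketch of my own (not from Apple’s repo), assuming MLX is installed from PyPI. It leans on the framework’s NumPy-style arrays, its composable grad transform, and lazy evaluation:

```python
# Minimal sketch of MLX's NumPy-like API (assumes `pip install mlx` and an
# Apple silicon Mac). Fits a linear model by gradient descent; arrays live
# in unified memory, so there are no explicit CPU/GPU transfers.
import mlx.core as mx

def loss(w, x, y):
    return mx.mean((x @ w - y) ** 2)

x = mx.random.normal((256, 8))
true_w = mx.random.normal((8,))
y = x @ true_w

grad_loss = mx.grad(loss)   # composable transform: loss -> d(loss)/dw
w = mx.zeros((8,))
for _ in range(200):
    w = w - 0.1 * grad_loss(w, x, y)

mx.eval(w)                  # MLX is lazy; eval() forces the computation
print(mx.mean(mx.abs(w - true_w)))
```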

Idiot Cops Are Spreading Misinformation FUD About NameDrop 

Jason Snell:

This is so bizarre. NameDrop is a feature that lets you AirDrop your contact information to someone else. For the feature to work, both phones need to be unlocked and one has to be placed directly over the other. The entire new tap-to-connect system is built to use physical proximity to confirm consent to sending or receiving data, replacing the old system in which you could leave your device open to AirDrop from all users — and receive all sorts of nasty unwanted stuff from nearby randos.

Once the physical act of tapping is done — it takes a few seconds, there’s a prominent animation, it’s nothing that is going to happen accidentally — you are given the option to share your contact information with the other person, and get to choose which information is shared! If you only want to share a phone number and not your home address, you can do that! It’s entirely in the user’s control. (If someone nefarious approached you and wanted to steal your information, they’d be better off just grabbing your unlocked phone and running away with it.)

Gemini: Google’s New AI Model 

Google:

Gemini is also our most flexible model yet — able to efficiently run on everything from data centers to mobile devices. Its state-of-the-art capabilities will significantly enhance the way developers and enterprise customers build and scale with AI.

We’ve optimized Gemini 1.0, our first version, for three different sizes:

  • Gemini Ultra — our largest and most capable model for highly complex tasks.
  • Gemini Pro — our best model for scaling across a wide range of tasks.
  • Gemini Nano — our most efficient model for on-device tasks.

Loosely speaking, Gemini Ultra is competing with GPT-4, and Gemini Pro with GPT-3.5. Nano, the on-device model, will first appear on Pixel 8 Pro phones. It’s unclear to me whether that’s because Gemini Nano is tuned to specifically take advantage of the Pixel 8 Pro’s Tensor G3 chip, or if it will expand to additional Android phones with other silicon.

Google has a 6-minute demo of Gemini in action, and it’s rather incredible. But it also comes with this disclaimer: “For the purposes of this demo, latency has been reduced and Gemini outputs have been shortened for brevity.” Why not show it in real time, even if it’s slow? It seems like the whole demo ought to be considered fraudulent — a fake. What’s wrong with Google as a company that they repeatedly try to pass off concept videos as legitimate demos of actual products?

iOS 17.2 Adds NameDrop-Like Feature for Sharing Boarding Passes, Movie Tickets, and Other Wallet Items 

Joe Rossignol, MacRumors:

Starting with the upcoming iOS 17.2 software update, there is a new NameDrop-like feature that allows an iPhone user to quickly share boarding passes, movie tickets, and other Wallet app passes with another iPhone user.

To use the feature, open the Wallet app and tap on the pass that you want to share. Then, hold your iPhone near the top of another iPhone, and a “Share” button will appear below the pass on your iPhone. Finally, tap on the “Share” button to send the pass to the other iPhone via AirDrop. Both iPhones must be updated to iOS 17.2.

Harvard, M.I.T., and Penn Presidents Under Fire After Dodging Questions About Antisemitism 

Stephanie Saul and Anemona Hartocollis, reporting for The New York Times:

Support for the presidents of Harvard, the University of Pennsylvania and M.I.T. eroded quickly on Wednesday, after they seemed to evade what seemed like a rather simple question during a contentious congressional hearing: Would they discipline students calling for the genocide of Jews?

Their lawyerly replies to that question and others during a four-hour hearing drew incredulous responses. “It’s unbelievable that this needs to be said: Calls for genocide are monstrous and antithetical to everything we represent as a country,” said a White House spokesman, Andrew Bates. [...]

Much of the criticism landed heavily on Ms. Magill because of an extended back-and-forth with Representative Stefanik. Ms. Stefanik said that in campus protests, students had chanted support for intifada, an Arabic word that means uprising and that many Jews hear as a call for violence against them. Ms. Stefanik asked Ms. Magill, “Does calling for the genocide of Jews violate Penn’s rules or code of conduct, yes or no?”

Ms. Magill replied, “If the speech turns into conduct, it can be harassment.”

Ms. Stefanik pressed the issue: “I am asking, specifically: Calling for the genocide of Jews, does that constitute bullying or harassment?”

Ms. Magill, a lawyer who joined Penn last year with a pledge to promote campus free speech, replied, “If it is directed and severe, pervasive, it is harassment.”

Ms. Stefanik responded: “So the answer is yes.”

Ms. Magill said, “It is a context-dependent decision, congresswoman.”

Ms. Stefanik exclaimed: “That’s your testimony today? Calling for the genocide of Jews is depending upon the context?”

The reckoning has come for the bizarro-world political climate that’s taken hold at these universities in the last decade or two. This patently offensive equivocation — when the correct answer was obviously an unambiguous “Yes” — makes sense in the context of the insular far-left worldview where the oppressed are viewed as inherently just, but comes across as absurd to everyone living in the real world. All three of these elite university presidents are obviously utterly tone-deaf and detached from the real world.

You can only pretend to live in a bubble for so long. Then the bill comes due.

New Mexico Sues Meta Over CSAM Content on Facebook and Instagram 

Rohan Goswami, reporting for CNBC:

Facebook and Instagram created “prime locations” for sexual predators that enabled child sexual abuse, solicitation, and trafficking, New Mexico’s attorney general alleged in a civil suit filed Wednesday against Meta and CEO Mark Zuckerberg.

The suit was brought after an “undercover investigation” allegedly revealed myriad instances of sexually explicit content being served to minors, child sexual coercion, or the sale of child sexual abuse material, or CSAM, New Mexico attorney general Raúl Torrez said in a press release.

The suit alleges that “certain child exploitative content” is ten times “more prevalent” on Facebook and Instagram as compared to pornography site PornHub and adult content platform OnlyFans, according to the release.

This follows the recent and ongoing investigative reporting by The Wall Street Journal into child porn rings on Instagram, and the ways in which their content algorithms send these deviants further down their perverted rabbit holes.

Which in turn leads the Muskateers paying for Twitter/X to ask questions like “Why are advertisers still on Facebook and Instagram but have such a massive problem with X, which bans such content?”

No content is more electrifyingly objectionable than CSAM. No bones about it, Meta has both a content moderation problem and a PR fiasco on its hands. They have got to stamp this out, or advertisers will start abandoning their platforms. But there are huge differences between Meta and X. Meta does not want CSAM or even CSAM-adjacent content on its platforms. Their current content moderation infrastructure quashes a shocking amount of it already. They need to do better, and I think most people believe they want to. The objectionable material on Twitter/X, on the other hand — the racism, the antisemitism, the outright Nazism — is explicitly permitted in the name of “free speech”. And in terms of perception, which is what advertisers care most about, Twitter/X is defined now by its number-one user, Elon Musk. He is the star of the platform, much as Tucker Carlson was to Fox News.

Also, more cynically, ads on Instagram work — advertisers gain more in sales than they spend on the ads. That’s less true — and perhaps not true at all — on Twitter/X.

Meta’s big legal problem isn’t that they’ve looked the other way at CSAM, but that they’ve deliberately looked the other way at under-13 users signing up for Instagram accounts, and purposely optimized their algorithms to engage teens. It doesn’t pass the sniff test that they’d want CSAM on Instagram; it easily passes the sniff test that they’d want to hook kids on the platform as young as possible.

Norman Lear: The Mensch 

Dave Pell, writing at NextDraft about Norman Lear, who died at the ripe old age of 101:

From his tours of duty during WWII to his sensational, culture changing television creations, to his political activism, to the good, decent, kind life he lived, Norman Lear represented the greatest of the greatest generation. I was lucky enough to spend some time with Norman. Yes, he was a comedic genius and maybe television’s most important creator, but he was also a deeply interested, open, curious, people person. He was great, and also good. He truly lived the lyrics of the theme for his show One Day at a Time. This is it. This is life, the one you get, so go and have a ball.

What a career. He didn’t just create some of the best sitcoms on TV during his prime, he created most of the best sitcoms: Sanford & Son (my dad’s favorite), One Day at a Time, Maude, Good Times, Mary Hartman, Mary Hartman, The Jeffersons, and, of course, his masterpiece, All in the Family.

Over at BoingBoing, Mark Frauenfelder has a 50-year-old All in the Family clip that, aside from Rob Reiner’s hairstyle, could have been recorded today. Archie Bunker was a more coherent Trump than Trump.

(With Charlie Munger dying at 99, Henry Kissinger at 100, and now Lear at 101, I’d be nervous if I were a famous 102-year-old.)

Update: A delightful anecdote from Alex Edelman, about Lear pitching him an idea for a new show at age 100.

23andMe Confirms Hackers Stole Ancestry Data on 6.9 Million Users 

Lorenzo Franceschi-Bicchierai, reporting for TechCrunch:

On Friday, genetic testing company 23andMe announced that hackers accessed the personal data of 0.1% of customers, or about 14,000 individuals. The company also said that by accessing those accounts, hackers were also able to access “a significant number of files containing profile information about other users’ ancestry.” But 23andMe would not say how many “other users” were impacted by the breach that the company initially disclosed in early October.

As it turns out, there were a lot of “other users” who were victims of this data breach: 6.9 million affected individuals in total.

In an email sent to TechCrunch late on Saturday, 23andMe spokesperson Katie Watson confirmed that hackers accessed the personal information of about 5.5 million people who opted-in to 23andMe’s DNA Relatives feature, which allows customers to automatically share some of their data with others. The stolen data included the person’s name, birth year, relationship labels, the percentage of DNA shared with relatives, ancestry reports and self-reported location.

Here’s a real shocker: 23andMe has updated their terms of service in an attempt to prevent a class action lawsuit. Good luck with that.

Apple Requires Only a Subpoena to Turn Over Push Notification Tokens to Law Enforcement; Google Requires a Court Order 

Drew Harwell, reporting for The Washington Post:

Apple said in a statement that “the federal government had prohibited us from sharing any information” about the requests and now that the method had become public, it was updating its upcoming transparency reports to “detail these kinds of requests.”

Apple’s Law Enforcement Guidelines, the company’s rules for how police and government investigators should seek user information, now note that a person’s Apple ID, associated with a push-notification token, can be “obtained with a subpoena or greater legal process.”

Neither Wyden nor Apple detailed how many notifications had been reviewed, who had been targeted, what crimes were being investigated or which governments had made the requests.

Law enforcement agents can issue subpoenas on their own, so there’s no oversight here. Google, on the other hand, requires a court order:

For U.S. requests of push notifications and other non-content information, Google said it requires a court order, not just a subpoena, that is subject to judicial oversight. With such orders, federal officials must persuade a judge that the requested data is relevant and material to an ongoing criminal probe.

Score one for Google here.

Senator Ron Wyden: Governments Are Spying on Apple and Google Users Through Push Notifications 

Raphael Satter, reporting for Reuters:

Unidentified governments are surveilling smartphone users via their apps’ push notifications, a U.S. senator warned on Wednesday. In a letter to the Department of Justice, Senator Ron Wyden said foreign officials were demanding the data from Alphabet’s Google and Apple. Although details were sparse, the letter lays out yet another path by which governments can track smartphones. [...]

In a statement, Apple said that Wyden’s letter gave them the opening they needed to share more details with the public about how governments monitored push notifications. “In this case, the federal government prohibited us from sharing any information,” the company said in a statement. “Now that this method has become public we are updating our transparency reporting to detail these kinds of requests.”

Google said that it shared Wyden’s “commitment to keeping users informed about these requests.”

From Wyden’s letter to Attorney General Merrick Garland:

Apple and Google should be permitted to be transparent about the legal demands they receive, particularly from foreign governments, just as the companies regularly notify users about other types of government demands for data. These companies should be permitted to generally reveal whether they have been compelled to facilitate this surveillance practice, to publish aggregate statistics about the number of demands they receive, and unless temporarily gagged by a court, to notify specific customers about demands for their data. I would ask that the DOJ repeal or modify any policies that impede this transparency.

See also: Joseph Cox, reporting at 404 Media: “Here’s a Warrant Showing the U.S. Government is Monitoring Push Notifications”.

The Standalone iTunes Movies and TV Shows Apps Are Discontinued in tvOS 17.2 

Benjamin Mayo, 9to5Mac:

As first reported in October, Apple will discontinue the standalone iTunes Movies and iTunes TV Shows apps on the Apple TV box, starting with tvOS 17.2. The warning message seen above has started appearing in the release candidate version of tvOS 17.2 beta, released yesterday.

Apple directs users to the TV app instead to manage their purchases, and buy and rent from the store. At least as far as Apple’s video content is concerned, the iTunes brand is on the way out.

Apple has updated the TV app in 17.2 in preparation of the migration away from the standalone iTunes videos app, bringing across some functionality that was previously missing in TV. That includes things like filtering by genre in purchased tab, and the inclusion of box sets in the store listings. The TV app also features a new sidebar design in this update, which includes a dedicated store and purchases tab for quick navigation.

It’s the updates to the TV app that make this possible. It’s a good simplification overall: Apple’s own video content — both iTunes purchases and TV+ streaming — is now all in the TV app.

Gurman Predicts Big March for Apple: New iPads Pro and Air, M3 MacBook Airs, and New iPad Peripherals 

Mark Gurman, reporting for Bloomberg:

The iPad Air, which is the company’s mid-tier tablet, currently comes with a 10.9-inch screen. For next year’s release, the company will add a version that’s about 12.9 inches, matching the size of what’s currently the biggest iPad Pro.

The company is also preparing revamped versions of the Apple Pencil and Magic Keyboard accessories, which it will sell alongside the new iPad Pro. The new Pencil — codenamed B532 — will represent the third generation of the product. The company released a new low-end model in November.

The new Magic Keyboards — codenamed R418 and R428 — will make the iPad Pro look more like a laptop and include a sturdier frame with aluminum.

A big iPad Air is interesting, and I suspect will prove popular. No word, alas, on a new iPad Mini though. (I wish Apple would drop the “Mini” brand and just make the iPad Air in three sizes: mini, regular, and large, with identical specs.)

Gurman offers no details about the form factor for the updated iPad Pro models. Given that last year’s 10th-generation regular iPad moved the front-facing camera to the long side of the device — the appropriate location for a camera when the iPad is being used laptop-style — it seems like a safe guess that Apple will do the same with these next-gen iPad Air and Pro models. But the spot where that camera would go is currently the same spot where current iPad Pros have the magnetic attachment for a 2nd-gen Apple Pencil. So I think that’s why Apple is going to introduce a 3rd-gen Pencil — they might need an altogether new way of pairing, charging, and attaching Pencils if they move the front-facing camera to the long side. (Well, that’s one reason to create a 3rd-gen Pencil. Other reasons, of course, would include various ways of making a better stylus — the current 2nd-gen Pencil is now over 5 years old.)

I’m also quite curious about the purported reimagined Magic Keyboards. The current ones are transformative for iPads, functionally, but the rubbery surface material just isn’t durable enough — especially the white ones. MacBooks are remarkably durable; iPad Magic Keyboards demand to be treated carefully. On mine, the rubber is peeling away around my most-used keys. That shouldn’t happen with any keyboard, but it definitely shouldn’t happen with one that costs $300-350.

Bloomberg: ‘Apple Set to Avoid EU Crackdown Over iMessage Service’ 

Samuel Stolton, reporting for Bloomberg:*

Apple Inc.’s iMessage service looks set to win a carve out from new European Union antitrust rules to rein in Big Tech platforms after watchdogs tentatively concluded that it isn’t popular enough with business users to warrant being hit by the regulation. [...]

In order to fall under the scope of the rules, a service must be deemed an “important gateway” for business users. EU enforcers now consider this is not the case for iMessage, according to the people.

If iMessage ended up being targeted by the Digital Markets Act, Apple would have faced potentially onerous obligations to make iMessage work with rival online messaging services, such as Meta Platforms Inc.’s WhatsApp or Facebook Messenger — a move that Apple has already strongly contested.

The elephant in the room with this particular issue is that the interoperability demands of the DMA between E2EE messaging platforms make no technical sense whatsoever. It’s all just hand-waving on the part of the EU bureaucrats who are demanding it. They have no idea what E2EE really means. They just want to demand that a WhatsApp user should be able to send a message to someone on iMessage or Facebook Messenger. Just make it happen.

Who would run key exchange, and manage the discovery and distribution of said keys, for E2EE messages sent across platforms? Key exchange and discovery is essential, and a difficult problem to solve within each platform itself. I think it’s impossible across platforms. Within each platform, the platform owner is in charge and handles these things. With this EU fantasy of mandatory interop across messaging platforms, who would be in charge?
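To make the key-discovery problem concrete, here’s a deliberately oversimplified sketch. Everything in it is hypothetical (real systems involve per-device keys, rotation, and verification mechanisms), but it shows why whoever runs the directory is the party everyone must trust:

```python
# Hypothetical, heavily simplified illustration -- not any platform's real
# protocol. Within a single platform, the operator runs the directory that
# maps a user's identifier to that user's public keys.
imessage_directory = {"+1-555-0100": ["apple-registered-device-key-1"]}
whatsapp_directory = {"+1-555-0100": ["meta-registered-device-key-1"]}

def encrypt_for(recipient_id: str, directory: dict, plaintext: str) -> list:
    # The sender has to trust this directory: if it hands back an attacker's
    # key, the attacker can silently read the message (a man in the middle).
    keys = directory[recipient_id]
    return [f"<{plaintext!r} encrypted to {key}>" for key in keys]

# Within one platform, the platform owner is the trusted directory:
print(encrypt_for("+1-555-0100", imessage_directory, "hello"))

# Across platforms, which directory does an iMessage sender query for a
# WhatsApp user -- Apple's, Meta's, or some new broker created to satisfy
# the DMA? Whoever answers becomes a single point of trust (and compulsion)
# for every platform's users.
```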

Apple getting exempted from this, I think, will mainly benefit Apple by letting them ignore an impossible mandate. I don’t think this interop will ever come to fruition, no matter what the EU demands, because I don’t think it can, nor do I think it should. Would be nice to just avoid the debate.

* You know.

Thieves Rob D.C. Uber Eats Driver, Steal Her Car, But Reject Android Phone 

Carl Willis, reporting for ABC 7News in Washington D.C.:

“As soon as he parked the car two masked gentlemen came up to him, armed,” she said. “They robbed him, took everything he had in his pockets, took the keys to my truck and got in and pulled off.”

She said one of them approached on foot in the 2400 block of 14th Street, NW. The other was in a black BMW, both of them armed with guns. She said the robbers were bold taking her husband’s phone, but then giving it back because it wasn’t to their liking.

“They basically looked at that phone and was like ‘Oh, that’s an Android? We don’t want this. I thought it was an iPhone,’” she said.

Leave the Android, take the cannoli.

Bending Spoons, the Parent Company That Now Owns — and Laid Off the Staff of — Filmic 

The Impassioned Moderate, a year ago:

News came out a few weeks ago that Bending Spoons, a consumer app studio, raised a massive $340 million round of financing. The press gushed about it: “Hollywood star, tech execs invest in Italian start-up Bending Spoons”, “Ryan Reynolds invests in ‘terrifying’ Italian start-up Bending”. And Ryan himself said things that are just so easy to imagine him saying (a testament to the spectacular job he’s done branding himself): “Their apps enable anyone to become a creative genius with minimum effort. In fact, their products terrify me so much, I had to invest.” (Ironically - or not? - his ad agency is called Maximum Effort…)

The problem? Bending Spoons is one of the most predatory actors on the entire App Store - they’re terrifying in a completely different way.

Bending Spoons’s business model is to buy successful apps, change them to a weekly auto-renewing subscription model that perhaps tricks users into signing up, and use the revenue to buy more apps and repeat the cycle. Filmic, for example, now defaults to a $3/week subscription — over $150/year. To be fair, there’s also a $40/year subscription.

The business model doesn’t seem like a scam, per se, but Bending Spoons doesn’t seem like a product-driven company. Apps seemingly don’t thrive after acquisition — instead, they get bled dry. There are some apps where a weekly subscription makes sense — Flighty comes to mind, for occasional travelers — but a camera app? Feels deceptive.

Bending Spoons is a big company with a lot of revenue, and it spends a lot of money on App Store and Play Store search ads. (Here’s Tim Cook visiting their office last year.)

Kino: Forthcoming Video Camera App for iPhone From the Makers of Halide 

The timing is surely coincidental with regard to the news about Filmic, but, as they say, fortune favors the prepared.

Filmic’s Entire Staff Laid Off by Parent Company Bending Spoons 

Jaron Schneider, reporting for PetaPixel:

Filmic, or FiLMiC as written by the brand, no longer has any dedicated staff as parent company Bending Spoons has laid off the entire team including the company’s founder and CEO, PetaPixel has learned. Considered for years as the best video capture application for mobile devices, the team behind Filmic Pro and presumably Filmic Firstlight — the company’s photo-focused app — has been let go. [...]

It is unclear what Bending Spoons intends to do with Filmic Pro or Filmic Firstlight, but there were early signs of trouble when the company’s most recent major update was last year. The most recent notable update to Filmic Pro came in October which brought support for Apple Log into the app, but there was no mention of the addition of external SSD support, odd considering that Filmic Pro had a strong track record for updating its platform to work with all of the new iPhone updates — especially those that are particularly important for video.

In Filmic’s absence, Blackmagic Design’s iOS app has become the most popular way to capture footage with the new iPhones and was used by Apple’s in-house team for the production of its Mac event on October 31.

Christina Warren, on Threads:

Hate this but I’m sadly not at all surprised. Filmic has an incredible product they were afraid to charge for and when they finally changed pricing models, it was too little too late and users rebelled. If they had been charging $100 a year or even upfront in 2015, I think they could have survived without selling to the Bending Spoons vultures. But now they’ve got a subscription app that isn’t actively improving and free competition from Black Magic who uses their apps as loss leaders. Hate it.

Filmic was featured by Apple in numerous iPhone keynotes and App Store promotions over the years — for a long stretch it was undeniably the premier “pro” video camera app for iPhones.

India Is Considering EU-Style Charger Rules That Would Block Older iPhones From Sale 

Aditya Kalra and Munsif Vengattil, reporting for Reuters from New Delhi:

India wants to implement a European Union rule that will require smartphones to have a universal USB-C charging port, and has been in talks with manufacturers about introducing the requirement in India by June 2025, six months after the deadline in the EU. While all manufacturers including Samsung have agreed to India’s plan, Apple is pushing back. [...]

In a closed-door Nov. 28 meeting chaired by India’s IT ministry, Apple asked officials to exempt existing iPhone models from the rules, warning it will otherwise struggle to meet production targets set under India’s production-linked incentive (PLI) scheme, according to the meeting minutes seen by Reuters. [...]

In terms of market share, Apple accounts for 6% of India’s booming smartphone market, compared with just about 2% four years ago. Apple suppliers have expanded their facilities and make most iPhone 12, 13, 14 and 15 models in India for local sales and exports, Counterpoint Research estimates. Only iPhone 15 has the new universal charging port. Apple told Indian officials in the meeting that the “design of the earlier products cannot be changed,” the document showed.

Consumers in India’s price-conscious market prefer buying older models of iPhones which typically become cheaper with new launches, and India’s push for the common charger on older models could hit Apple’s targets, said Prabhu Ram, head of the Industry Intelligence Group at CyberMedia Research. “Apple’s fortunes in India have primarily been tied to older generation iPhones,” he said.

I was under the impression that the EU’s USB-C requirement would only apply to new devices, but maybe not? A plain reading of this EU press release suggests that all phones sold, starting in 2025, must have USB-C charging ports:

By the end of 2024, all mobile phones, tablets and cameras sold in the EU will have to be equipped with a USB Type-C charging port. From spring 2026, the obligation will extend to laptops.

That would mean, starting in January 2025, that the only iPhones available in the EU will be this year’s iPhones 15 and next year’s iPhones 16. A new fourth-generation iPhone SE with USB-C would give Apple a much-needed lower-priced model. The second-gen SE came in 2020; the current third-gen SE in 2022.

See also: Ben Lovejoy at 9to5Mac.

An AppleScript for Safari: Split Tabs to New Window 

I finally got around to scratching a longstanding itch. I’m an inveterate web browser tab hoarder, and a scenario I frequently encounter is wanting to move the most recent (typically, rightmost) tabs into a new window all by themselves. Let’s say, for example, I have 26 tabs open in the frontmost Safari window, A through Z. The current selected tab is X. This script will move tabs X, Y, and Z to a new window, leaving tabs A through W open in the old window. It starts with the current tab, and moves that tab and those to the right.

I have the script saved in my FastScripts scripts folder for Safari, but I tend to invoke it from LaunchBar (which I have configured to index my entire scripts folder hierarchy). Command-Space to bring up LaunchBar, type “spl” to select this script, hit Return, done.

I have no idea how many others might want this, but in recent years here at DF I’ve gotten away from sharing my occasional scripting hacks, and feel like I ought to get back to sharing them. Can’t let Dr. Drang have all the fun.

Update: Leon Cowle adapted my script to be more elegant and concise. If you’re using this but grabbed the script before 10:30pm ET, go back and re-grab it.

iCloud Advanced Data Protection Uptake Amongst DF Readers 

Back in August I ran a poll on Mastodon, asking my followers if they have iCloud Advanced Data Protection enabled. iCloud Advanced Data Protection was announced a year ago this week, alongside support for security keys (e.g. Yubico). The results, from 2,304 responses:

  • Yes: 29%
  • No: 59%
  • No, but would if not for device(s) with old OSes: 12%

Count me in that last group. I’ve got a handful of old devices that I still use which can’t be updated to an OS version that supports the feature. But one of these days I’ll just sign out of iCloud on those devices and enable this.

As ever when I run polls like this, it should go without saying that the Daring Fireball audience is not representative of the general public. The results for this poll — with nearly 30 percent of respondents having an esoteric security feature enabled — exemplify that.

‘The Lost Voice’ 

One of Apple’s latest accessibility features is Personal Voice — for people who are “at risk of voice loss or have a condition that can progressively impact your voice”, Personal Voice lets you create a voice that sounds like you.

The Lost Voice is a two-minute short film directed by Taika Waititi celebrating this feature. It’s a splendid, heartwarming film, and it’s especially remarkable to see so much effort, such lavish production values and filmmaking talent, being applied to marketing a feature for a tiny fraction of Apple’s users. Most people do not need this feature. But for those who do, it seems life-altering. Genuinely profound.

Apple at its very best.

See also: Shelly Brisbin at Six Colors.

First Trailer for ‘Grand Theft Auto VI’ 

Three thoughts:

  • I did not expect to hear a Tom Petty song in a GTA trailer, but I love it. It works. (Hard to escape the feeling though that the Petty estate is willing to sell songs in ways Petty himself wouldn’t have.)

  • The game looks amazing.

  • “Coming 2025”! Holy smokes, this game has been in development for a decade. (GTA 5 came out in late 2013 and has sold 190 million copies and generated over $8 billion.)

Software Applications Incorporated 

You’ve probably seen Infinite Mac, the web-based emulator of classic Mac OS, before. But Software Inc. — a new company from some of the people behind Workflow, which became Shortcuts after acquisition by Apple — used it to create their company website, and it’s delightful.

Kolide 

My thanks to Kolide for sponsoring last week at DF. Getting OS updates installed on end user devices should be easy. After all, it’s one of the simplest yet most impactful ways that every employee can practice good security. On top of that, every MDM solution promises that it will automate the process and install updates with no user interaction needed. Yet in the real world, it doesn’t play out like that. Users don’t install updates and IT admins won’t force installs via forced restart.

With Kolide, when a user’s device — be it Mac, Windows, Linux, or mobile — is out of compliance, Kolide reaches out to them with instructions on how to fix it. The user chooses when to restart, but if they don’t fix the problem by a predetermined deadline, they’re unable to authenticate with Okta.

Watch Kolide’s on-demand demo to learn more about how it enforces device compliance for companies with Okta.

The Talk Show: ‘The Blurry Edge of Acceptable’ 

Nilay Patel returns to the show. Topics include the iPhones 15, journalism in the age of AI, and what it’s like to have Barack Obama on your podcast.

Sponsored by:

  • Trade Coffee: Let’s coffee better. Get a free bag of fresh coffee with any Trade subscription.
  • Squarespace: Save 10% off your first purchase of a website or domain using code talkshow.
  • Nuts.com: The world’s best snacks, delivered fast and fresh.

Maybe It Was a Panoramic Photo

Faruk Korkmaz posits a seemingly likely explanation for that “computational photography glitch in a bridal shop” photo: it was taken in Panoramic mode. The subject claims it wasn’t a Panoramic mode photo, but she didn’t snap the photo, and if a photo taken in Panoramic mode isn’t wide enough to reach some threshold, the Photos app does not identify/badge it as such. And conversely, a normal photograph cropped to a very wide aspect ratio will be badged as Panoramic — like this and this from my own library — even though it wasn’t snapped in Panoramic mode.

I think it’s quite likely Korkmaz is correct that this is the explanation for how this photo was created; I remain unconvinced that it wasn’t a deliberate publicity stunt.

‘Voice of a Star Wars Fan’ 

This is just an astonishing 20-minute film by Hiroshi Sumi. An homage and loving look back at the earliest days of Industrial Light and Magic. I don’t want to say much more than that lest I spoil the wonder of it. I don’t know why anyone would exert so much effort to make something like this but I’m so inordinately delighted that Sumi did. It speaks to the power of obsession.

After you watch it, take a look at this tweet from Sumi, and this prototype rendering from three years ago.

Just amazing. So much obvious love. (Via Todd Vaziri.)

CNBC Gets an Inside Look at an Apple Chip Lab 

Katie Tarasov, CNBC:

In November, CNBC visited Apple’s campus in Cupertino, California, the first journalists allowed to film inside one of the company’s chip labs. We got a rare chance to talk with the head of Apple silicon, Johny Srouji, about the company’s push into the complex business of custom semiconductor development, which is also being pursued by Amazon.

“We have thousands of engineers,” Srouji said. “But if you look at the portfolio of chips we do: very lean, actually. Very efficient.”

Can’t say there’s any news in this, but it’s neat to see inside the chip-testing lab. (Same video is available on YouTube, too, if that’s your jam.)

Amazon’s Fire TV Is Adding Full-Screen Video Ads That Play When You Start Your Fire TV 

Luke Bouma, writing for Cord Cutters:

Today, Cord Cutters News has confirmed that Amazon is adding full-screen video ads that will play when you start your Fire TV unless you quickly perform an action on it.

This new update will be rolling out to all Fire TVs made in 2016 or newer. With this update, the ad at the top of your Fire TV will now start playing full-screen, often promoting a movie or TV show. By hitting the home button, you can quickly exit the ad or if you quickly perform an action on the Fire TV once it finishes, you will avoid the video ad, but you only have a few seconds.

“Our focus is on delivering an immersive experience so customers can enjoy their favorite TV shows and movies, as well as browse and discover more content they’ll want to watch. We’re always working to make the Fire TV experience better for customers and have updated one of the prominent placements in the UI to play a short content preview if no other action is taken by a customer upon turning on their Fire TV.” Amazon said in a statement to Cord Cutters News.

What a load of horseshit from Amazon in that statement. Autoplaying ads aren’t “immersive”. And this is in no way “working to make the Fire TV experience better for customers”. Working to make things better would mean getting rid of shit like this, not adding it.

I really don’t understand how anyone uses anything but an Apple TV box. Apple TV is far from perfect but holy hell, it really does start from the perspective of respecting you, the user. The people at Apple who make it are obviously trying to create the experience that they themselves want when they’re watching TV at home.

Calling ‘Fake’ on the ‘iPhone Computational Photography Glitch in a Bridal Shop’ Viral Photo 

Wesley Hillard, self-described “Rumor Expert”, writing at AppleInsider:

A U.K. comedian and actor named Tessa Coates was trying on wedding dresses when a shocking photo of her was taken, according to her Instagram post shared by PetaPixel. The photo shows Coates in a dress in front of two mirrors, but each of the three versions of her had a different pose.

One mirror showed her with her arms down, the other mirror showed her hands joined at her waist, and her real self was standing with her left arm at her side. To anyone who doesn’t know better, this could prove to be quite a shocking image.

To the contrary, to anyone who “knows better”, this image clearly seems fake. But it’s a viral sensation:

Coates, in her Instagram description, claims “This is a real photo, not photoshopped, not a pano, not a Live Photo”, but I’m willing to say she’s either lying or wrong about how the photo was taken. Doing so feels slightly uncomfortable, given that the post was meant to celebrate her engagement, but I just don’t buy it. These are three entirely different arm poses, not three moments in time fractions of a second apart — and all three poses in the image are perfectly sharp. iPhone photography just doesn’t work in a way that would produce this image. I’d feel less certain this was a fake if there were motion blur in the arms in the mirrors. You can get very weird-looking photos from an iPhone’s Pano mode, but again, Coates states this is not a Pano mode image. (Perhaps you can generate an image like this using a Google Pixel 8’s Best Take feature, but this is purportedly from an iPhone, which doesn’t have a feature like that. And even with Best Take, that’s a feature you invoke manually, using multiple original images as input. I don’t think any phone camera, let alone an iPhone, produces single still images such as this.)

From a thread on Threads, where several commenters are rightfully skeptical:

  • Tyler Stalman (who hosts a great podcast on photography and videography):

    Any iPhone photographer can confirm that this is not an image processing error, it would never look like this.

  • David Imel (a writer/researcher for MKBHD):

    I really, REALLY do not think this is a real image. HDR on phones takes 5-7 frames with split-second exposure times. Whole process like .05 sec. Even a live photo is < 2 seconds.

    Even if the phone thought they were diff people it wouldn’t stitch like this and wouldn’t have time.

    This is spreading everywhere and it’s driving me insane.

I challenge anyone who thinks this is legit to produce such an image using an iPhone with even a single mirror in the scene, let alone two. If I’m wrong, let me know.

Update 1: Claude Zeins takes me up on my challenge.

Update 2: In a long-winded story post, Coates says she went to an Apple Store for an explanation and was told by Roger, the “grand high wizard” of Geniuses at the store, that Apple is “beta testing” a feature like Google’s Best Take. Which is not something Apple does, and if they did, would require her to have knowingly installed an iOS beta.

Update 3: Best theory to date: it was, despite Coates’s claim to the contrary, taken in Panoramic mode.

Podcast App Castro Might Be Dying 

Jason Snell:

Castro has been a popular iOS podcast app for many years, but right now things look grim.

The cloud database that backs the service is broken and needs to be replaced. As a result, the app has broken. (You can’t even export subscriptions out of it, because even that function apparently relies on the cloud database.) “The team is in the progress of setting up a database replacement, which might take some time. We aim to have this completed ASAP,” said an Xtweet from @CastroPodcasts.

What’s worse, according to former Castro team member Mohit Mamoria, “Castro is being shut down over the next two months.”

I always appreciated Castro — it’s a well-designed, well-made app that embraced iOS design idioms. But as a user it just never quite fit my mental model for how a podcast client should work, in the way that Overcast does. I wanted to like Castro more than I actually liked it.

As a publisher, Castro was the 4th or 5th most popular client for The Talk Show for a while, but in recent years has slipped. Right now it’s 10th — but in a logarithmic curve. Overcast remains 1st; Apple Podcasts 2nd. The truth is, if not for Overcast, Castro would likely be in that top position, not shutting down. But Overcast does exist, and it’s the app where most people with exquisite taste in UI are listening to podcasts. There aren’t many markets where listeners of The Talk Show are in the core demographic, but iOS podcast apps are one. I can’t say why or precisely when, but somewhere along the line Castro lost its mojo.

I salute everyone who’s worked on it, though, because it really is a splendid app.

MacOS Security Prompts Need a Rethinking 

Jason Snell, writing at Six Colors:

Last month I wrote about how Apple’s cascade of macOS alerts and warnings ruin the Mac upgrade experience. [...]

This issue was brought home to me last week when I was reviewing the M3 iMac and the M3 MacBook Pro. As a part of reviewing those computers, I used Migration Assistant to move a backup of my Mac Studio to the new systems via a USB drive. Sometimes I try to review a computer with nothing migrated over, but it can be a real slowdown and I didn’t really have any time to spare last week.

Anyway, by migrating, I got to (twice) experience Apple’s ideal process of moving every user from one Mac to the next. You start up your new computer, migrate from a backup of the old computer, and then start using the new one. There’s a lot that’s great about this process, and it’s so much better than what we used to have to do to move files over from one Mac to another.

And yet all of Apple’s security alerts got in the way again and spoiled the whole thing. Here’s a screenshot I took right after my new Mac booted for the first time after migration.

I went through the exact same thing. Except if I had taken a screenshot of all the security-permission alerts I had to go through, there would have been more of them — and Snell’s screenshot looks like a parody. Back in the heyday of the “Get a Mac” TV ad campaign, Apple justifiably lambasted Windows Vista for its security prompts, but that’s exactly the experience you get after running Migration Assistant on a Mac today. It’s terrible.

Don’t get me wrong: Migration Assistant is borderline miraculous. It’s a wonderful tool that seemingly just keeps getting better. But MacOS itself stores too many security/privacy settings in a way that’s tied to the device, not your user account. There ought to be some way to OK all these things in one fell swoop.

As Snell says, setting up a new Mac should be a joy, not a chore. Migration Assistant takes care of so much, but these cursed security prompts spoil the experience.

The Perils of Charging for Emergency Services 

Kyle Melnick, reporting last week for The Washington Post under the headline “A Toddler Was Taken in a Carjacking; VW Wanted $150 for GPS Coordinates, Lawsuit Says”:

Shepherd, who was four months pregnant, tried to fight off the man. But she was thrown to the pavement and run over by her own car as the man drove away with Isaiah in the back seat, authorities said. Shepherd thought she might never see her son again.

After Shepherd frantically called 911, investigators contacted Volkswagen’s Car-Net service, which can track the location of the manufacturer’s vehicles. They hoped to locate Isaiah.

But a customer service representative said that wouldn’t be possible because Shepherd’s subscription to the satellite service had expired, according to a new lawsuit. The employee said he couldn’t help until a $150 payment was made, the complaint said.

This perfectly illustrates the perils of Apple eventually charging for Emergency SOS satellite service. If Apple someday cuts off free service for compatible iPhones, sooner or later someone is going to die because they chose not to pay to continue service. No one wants that.

Apple Extends Emergency SOS via Satellite for an Additional Free Year 

Apple Newsroom, two weeks ago:

One year ago today, Apple’s groundbreaking safety service Emergency SOS via satellite became available on all iPhone 14 models in the U.S. and Canada. Now also available on the iPhone 15 lineup in 16 countries and regions, this innovative technology — which enables users to text with emergency services while outside of cellular and Wi-Fi coverage — has already made a significant impact, contributing to many lives being saved. Apple today announced it is extending free access to Emergency SOS via satellite for an additional year for existing iPhone 14 users.

My hunch on this is that Apple would like to make this available free of charge in perpetuity, but wasn’t sure how much it would actually get used, and thus how much it would actually cost. If they come right out and say it’s free forever, then it needs to be free forever. It’s safer to just do what they’ve done here: extend the free period one year at a time, and see how it goes as more and more iPhones that support the feature remain in active use.

It’s a wonderful feature — quite literally life-saving in numerous cases — but it’d be hard to sell. It’s like buying insurance. People like paying for stuff they want to use, not for stuff they hope they never need. Obviously, people do buy insurance — Apple itself, of course, sells AppleCare — but how many people would pay extra for Emergency SOS? If Apple can just quietly eat the cost of this service, they should, and I think they will.

Charlie Munger, Warren Buffett’s Partner and Vice Chairman of Berkshire Hathaway, Dies at 99 

Andrew Ross Sorkin and Robert D. Hershey Jr., reporting for The New York Times:

Charles T. Munger, who quit a well-established law career to be Warren E. Buffett’s partner and maxim-spouting alter-ego as they transformed a foundering New England textile company into the spectacularly successful investment firm Berkshire Hathaway, died on Tuesday in Santa Barbara, Calif. He was 99.

His death, at a hospital, was announced by Berkshire Hathaway. He had a home in Los Angeles.

Although overshadowed by Mr. Buffett, who relished the spotlight, Mr. Munger, a billionaire in his own right — Forbes listed his fortune as $2.6 billion this year — had far more influence at Berkshire than his title of vice chairman suggested.

Mr. Buffett has described him as the originator of Berkshire Hathaway’s investing approach. “The blueprint he gave me was simple: Forget what you know about buying fair businesses at wonderful prices; instead, buy wonderful businesses at fair prices,” Mr. Buffett once wrote in an annual report. [...]

A $1,000 investment in Berkshire made in 1964 is worth more than $10 million today.

Mr. Munger was often viewed as the moral compass of Berkshire Hathaway, advising Mr. Buffett on personnel issues as well as investments. His hiring policy: “Trust first, ability second.”

A new edition of Munger’s book of aphorisms, Poor Charlie’s Almanack — its title an allusion to Munger’s idol, Benjamin Franklin — is due next week.

WSJ Reports That Apple and Goldman Sachs Are Parting Ways on Apple Card 

AnnaMaria Andriotis, reporting for The Wall Street Journal (News+):

Apple is pulling the plug on its credit-card partnership with Goldman Sachs, the final nail in the coffin of the Wall Street bank’s bid to expand into consumer lending.

The tech giant recently sent a proposal to Goldman to exit from the contract in the next roughly 12-to-15 months, according to people briefed on the matter. The exit would cover their entire consumer partnership, including the credit card the companies launched in 2019 and the savings account rolled out this year.

It couldn’t be learned whether Apple has already lined up a new issuer for the card.

Apple Card is a strange product — everyone I know who has one likes it (including me), but Goldman itself has reported that they’ve lost $3 billion since 2020 on it. The savings accounts are a hit with customers too.

American Express is rumored to be one possible partner, but it would be pretty strange for Apple Cards to transmogrify from MasterCard to Amex cards overnight. There are still a lot of businesses — particularly throughout Europe — that accept MasterCard but not Amex. It’s not just that Apple Card would no longer be accepted at businesses where previously it was; such a switch would also highlight the fact that Apple Card is really just an Apple-branded card issued by a company that isn’t Apple. Apple wants you to think of Apple Card as, well, an Apple credit card.


More on Sam Altman’s Ouster From OpenAI

Kevin Roose, reporting for The New York Times:

An all-hands meeting for OpenAI employees on Friday afternoon didn’t reveal much more. Ilya Sutskever, the company’s chief scientist and a member of its board, defended the ouster, according to a person briefed on his remarks. He dismissed employees’ suggestions that pushing Mr. Altman out amounted to a “hostile takeover” and claimed it was necessary to protect OpenAI’s mission of making artificial intelligence beneficial to humanity, the person said.

Mr. Altman appears to have been blindsided, too. He recorded an interview for the podcast I co-host, “Hard Fork,” on Wednesday, two days before his firing. During our chat, he betrayed no hint that anything was amiss, and he talked at length about the success of ChatGPT, his plans for OpenAI and his views on A.I.’s future.

Mr. Altman stayed mum about the precise circumstances of his departure on Friday. But Greg Brockman — OpenAI’s co-founder and president, who quit on Friday in solidarity with Mr. Altman — released a statement saying that both of them were “shocked and saddened by what the board did today.” Mr. Altman was asked to join a video meeting with the board at noon on Friday and was immediately fired, Mr. Brockman said.

Kara Swisher was all over the story last night, writing on Twitter/X:

Sources tell me that the profit direction of the company under Altman and the speed of development, which could be seen as too risky, and the nonprofit side dedicated to more safety and caution were at odds. One person on the Sam side called it a “coup,” while another said it was the right move. [...]

More: The board members who voted against Altman felt he was manipulative and headstrong and wanted to do what he wanted to do. That sounds like a typical SV CEO to me, but this might not be a typical SV company. They certainly have a lot of explaining to do.

According to Brockman — who until he quit in protest of Altman’s firing was chairman of the OpenAI board — he didn’t find out until just 5 minutes before Altman was sacked. I’ve never once heard of a corporate board firing the company’s CEO behind the back of the chairman of the board.

It really does look more and more like a deep philosophical fissure inside OpenAI, between those led by Sutskever (and, obviously, a majority of the board) advocating a cautious slow and genuinely non-profit-driven approach, and Altman/Brockman’s “let’s move fast, change the world, and make a lot of money” side. Sutskever and the OpenAI board seemingly see Altman/Brockman as reckless swashbucklers; Altman and Brockman, I suspect, see Sutskever and his side as a bunch of ninnies.

A simple way to look at it is to read OpenAI’s charter, “the principles we use to execute on OpenAI’s mission”. It’s a mere 423 words, and very plainly written. It doesn’t sound anything at all like the company Altman has been running. The board, it appears, believes in the charter. How in the world it took them until now to realize Altman was leading OpenAI in directions completely contrary to their charter is beyond me.

It’s like the police chief in Casablanca being “shocked — shocked!” to find out that gambling was taking place in the casino where he himself played.


Qualcomm’s Awkward Boasting Regarding Its Forthcoming X Elite Platform

Monica Chin, reporting for The Verge last month, “Qualcomm Claims Its Snapdragon X Elite Processor Will Beat Apple, Intel, and AMD”:

Qualcomm has announced its new Snapdragon X Elite platform, which looks to be its most powerful computing processor to date. The chips (including the new Qualcomm Oryon, announced today) are built on a 4nm process and include 136GB/s of memory bandwidth. PCs are expected to ship in mid-2024. [...]

Oh, Qualcomm also claims that its chip will deliver “50% faster peak multi-thread performance” than Apple’s M2 chip. This is just a funny claim; the X Elite has 50 percent more cores than the M2 and sucks down much more power, so of course it is going to do better on Geekbench at “peak multi-thread performance.” That’s like a professional sprinter bragging about winning the 100-meter dash against a bunch of marathon champions.

This news is so old that Chin is no longer on the staff at The Verge (which I think explains why she didn’t write either of their reviews for the new M3 MacBook Pros), but I’m cleaning up old tabs and wanted to comment on this.

It’s nonsense. Chips that aren’t slated to appear in any actual laptops until “mid-2024” are being compared to the M2, which Apple debuted with the MacBook Air in June 2022. So even if Qualcomm’s performance claims are true and PCs based on their chips ship on schedule, they’re comparing against a chip that Apple debuted two entire years earlier.

Plus they’re only comparing multi-core performance against the base M2. And they’re not really comparing multi-core performance overall but “peak” performance, however it is they define that. And the fact that they only mention multi-core performance strongly suggests that they’re slower than the M2 at single-core performance, which for most consumer/prosumer use cases is more important.

And: No one in the PC world seems to care about ARM chips, at least for laptops. Microsoft made a go of it with their Surface line and largely gave up. My understanding is that fewer than 1 percent of PC sales today are ARM-based machines. If Microsoft wasn’t willing to optimize Windows to make it ARM-first, or even treat ARM as an equal to x86, when they themselves were trying to make ARM-based Windows laptops a thing, why would they do it now?

If Mac hardware and MacOS were made by separate companies, and the MacOS software company licensed their OS to other OEMs, I really don’t think Apple silicon hardware would have happened. The seemingly too-good-to-be-true performance of Apple silicon Macs is the result of the silicon being designed for the software, and the software being optimized, down to very low levels, for the silicon. Qualcomm isn’t going to get that from Microsoft with Windows.

Qualcomm’s X Elite platform may well beat Intel and AMD, but I’m not sure that will matter in the PC world unless Microsoft truly goes all-in on ARM with Windows. Which I don’t see happening. But the idea that they’re even vaguely catching up to Apple silicon is laughable, and it’s frustrating that so much of the tech press took anything Qualcomm claimed about relative performance against Apple silicon seriously.

We know for a fact that their Snapdragon chips for phones have always lagged years behind Apple’s A-series chips in both sheer performance and performance-per-watt, with no sign that they’re catching up. So how in the world would their ARM chips for PCs beat Apple’s M-series chips?

And, yes, I predicted this back in November 2021, when Qualcomm claimed they’d be shipping “M-series competitive” chips for PCs by 2023. Qualcomm claimed to still be on track to ship in 2023 just one year ago, so I wouldn’t hold my breath for “mid-2024” either. 


Vision Pro, Spatial Video, and Panoramic Photos

Yesterday Apple released developer beta 2 of iOS 17.2, the first version of iOS to include support for capturing spatial video with iPhone 15 Pro models. Today came the public beta, enabling the same feature. Apple invited me to New York yesterday, not merely to preview capturing spatial video using an iPhone, but to experience watching those spatial videos using a Vision Pro.

The experience was, like my first Vision Pro demo back at WWDC in June, astonishing.

Capturing Spatial Video on an iPhone

Shooting spatial video on an iPhone 15 Pro is easy. The feature is — for now at least — disabled by default, and can be enabled in Settings → Camera → Formats. The option is labeled “Spatial Video for Apple Vision Pro”, which is apt, because (again, for now at least) spatial video only looks different from non-spatial video when playing it back on Vision Pro. When viewed on any other device it just looks like a regular flat video.

Once enabled, you can toggle spatial video capture in the Camera app whenever you’re in the regular Video mode. It’s very much akin to the toggle for Live Photos when taking still photos, or the Action mode toggle for video — not a separate mode in the main horizontal mode switcher in Camera, but a toggle button in the main Video mode.

When capturing spatial video, you have no choice regarding resolution, frame rate, or file format. All spatial video is captured at 1080p, 30 fps, in the HEVC file format.1 You also need to hold the phone horizontally, to put the two capturing lenses on the same plane. The iPhone uses the main (1×) and ultra wide (0.5×) lenses for capture when shooting spatial video, and in fact, Apple changed the arrangement of the three lenses on the iPhone 15 Pro in part to support this feature. (On previous iPhone Pro models, when held horizontally, the ultra wide camera was on the bottom, and the main and telephoto lenses were next to each other on the top.)

I believe resolution is limited to 1080p because, to get an image from the ultra wide 0.5× camera with the same field of view as the main 1× camera, the iPhone needs to crop the ultra wide image significantly. The ultra wide camera has a mere 12 MP sensor, so there just aren’t enough pixels to crop a 1× equivalent field of view from the center of the sensor and get a 4K image.
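
To put rough numbers on that (my back-of-the-envelope figures, not anything Apple has published): matching the 1× field of view from the 0.5× ultra wide means cropping roughly 2× on each axis, which throws away about three quarters of the sensor’s pixels. A minimal sketch in Swift:

    // Back-of-the-envelope sketch; the 2× crop factor and 12 MP figure are my
    // assumptions about the hardware, not Apple’s published specs.
    let ultraWideSensorPixels = 12_000_000.0       // ~12 MP ultra wide sensor
    let cropPerAxis = 2.0                          // 0.5× cropped to a 1× equivalent field of view
    let usablePixels = ultraWideSensorPixels / (cropPerAxis * cropPerAxis)  // ≈ 3 MP

    let fourKPixels = 3840.0 * 2160.0              // ≈ 8.3 MP needed for 4K
    let fullHDPixels = 1920.0 * 1080.0             // ≈ 2.1 MP needed for 1080p

    print(usablePixels >= fourKPixels)             // false: not enough pixels for 4K
    print(usablePixels >= fullHDPixels)            // true: 1080p fits comfortably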

There are two downsides to shooting spatial video with your iPhone. First, the aforementioned 1080p resolution and 30 fps frame rate. I’ve been shooting 4K video by default for years, because why not? I wish we could capture spatial video at 4K, but alas, not yet. The second downside to shooting spatial video is that it effectively doubles the file size compared to non-spatial 1080p, for the obvious reason that each spatial video contains two 1080p video streams. That file-size doubling is a small price to pay — the videos are still smaller than non-spatial 4K 30 fps video.2
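
Using Apple’s per-minute averages (see footnote 2), the practical storage math works out to something like this. The per-minute figures are Apple’s; the hour-of-footage scenario is my own illustration:

    // Per-minute averages are Apple’s figures (footnote 2); the rest is simple arithmetic.
    let regular1080pMBPerMinute = 65.0
    let spatial1080pMBPerMinute = 130.0   // two 1080p streams, roughly double the size
    let regular4KMBPerMinute    = 190.0

    let minutesOfFootage = 60.0
    print(spatial1080pMBPerMinute * minutesOfFootage / 1024)  // ≈ 7.6 GB for an hour of spatial video
    print(regular4KMBPerMinute    * minutesOfFootage / 1024)  // ≈ 11.1 GB for an hour of regular 4K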

Really, it’s just no big deal to capture spatial video on your iPhone. If the toggle button is off, you capture regular video with all the regular options for resolution (720p/1080p/4K) and frame rates (24/30/60). If the toggle for spatial video is on — and when on it’s yellow, impossible to miss — you lose those choices but it just looks like capturing a regular video. And when you play it back, or share it with others who are viewing it on regular devices like their phones or computers, it just looks like a regular flat video.

If you own an iPhone 15 Pro, there’s no good reason not to start capturing spatial videos this year — like, say, this holiday season — to record any sort of moments that feel like something you might want to experience as “memories” with a Vision headset in the future, even if you don’t plan to buy the first-generation Vision Pro next year.

Viewing Spatial Video on Vision Pro

Before my demo, I provided Apple with my eyeglasses prescription, and the Vision Pro headset I used had appropriate corrective lenses in place. As with my demo back in June, everything I saw through the headset looked incredibly sharp.

Apple has improved and streamlined the onboarding/calibration process significantly since June. There are a few steps where you’re presented with a series of dots in a big circle floating in front of you, like the hour indexes on a clock. As you look at each dot, it lights up a bit, and you do the finger tap gesture. It’s the Vision Pro’s way of calibrating that what it thinks you’re looking at is what you actually are looking at. Once that calibration step was over — and it took just a minute or two — I was in, ready to go on the home screen of VisionOS. (And the precision of this calibration is amazing — UI elements can be placed relatively close to each other and it knows exactly which one you’re looking at when you tap. iOS UI design needs to be much more forgiving of our relatively fat fingertips than VisionOS UI design needs to be about the precision of our gaze.)

My demo yesterday was expressly about photography in general, and spatial video in particular, and so, per Apple’s request, it was limited to the Photos app in VisionOS. It was tempting, at times, to see where else I could go and what else I could do. But there was so much to see and do in Photos alone that my demo — about 30 minutes in total wearing Vision Pro — raced by.

Before Apple separated us into smaller rooms for our time using Vision Pro, I was paired with Joanna Stern from The Wall Street Journal for a briefing on the details of spatial video capture using the iPhones 15 Pro. We were each provided with a demo iPhone, and were allowed to capture our own spatial videos. We were in a spacious modern high-ceiling’d kitchen, bathed in natural light from large windows. A chef was busy preparing forms of sushi, and served as a model for us to shoot. Joanna and I also, of course, shot footage of each other, while we shot each other. It was very meta. The footage we captured ourselves was then preloaded onto the respective Vision Pro headsets we used for our demos.

We were not permitted to capture spatial video on Vision Pro.3 However, our demo units had one video in the Photos library that was captured on Vision Pro — a video I had experienced before, back in June, of a group of twenty-somethings sitting around a fire pit at night, having fun in a chill atmosphere. There were also several other shot-by-Apple spatial videos which were captured using an iPhone 15 Pro.

One obvious question: How different do spatial videos captured using iPhone 15 Pro look from those captured using a Vision Pro itself? Given that Apple provided only one example spatial video captured on Vision Pro, I don’t feel like I can fully answer that based on my experience yesterday. It did not seem like the differences were dramatic or significant. The spatial videos shot using iPhone 15 Pro that I experienced, including those I captured myself, seemed every bit as remarkable as the one captured using Vision Pro.

Apple won’t come right out and say it but I do get the feeling that all things considered, spatial video captured using Vision Pro will be “better”. The iPhone might win out on image quality, given the fact that the 1× main camera on the iPhone 15 Pro is the single best camera system Apple makes, but the Vision Pro should win out on spatiality — 3D-ness — because Vision Pro’s two lenses for spatial video capture are roughly as far apart from each other as human eyes. The two lenses used for capture on an iPhone are, of course, much closer to each other than any pair of human eyes. But despite how close the two lenses are to each other, the 3D effect is very compelling on spatial video captured on an iPhone. It’s somehow simultaneously very natural-looking and utterly uncanny.

Watching them is a stunning effect and a remarkable experience. And so the iPhone, overall, is going to win out as the “best” capture device for spatial video — even if footage captured on Vision Pro is technically superior — because, as the old adage states, the best camera is the one you have with you. I have my iPhone with me almost everywhere. That will never be even close to true for Vision Pro next year.

Here’s what I wrote about spatial video back in June, after my first hands-on time with Vision Pro:

Spatial photos and videos — photos and videos shot with the Vision Pro itself — are viewed as a sort of hybrid between 2D content and fully immersive 3D content. They don’t appear in a crisply defined rectangle. Rather, they appear with a hazy dream-like border around them. Like some sort of teleportation magic spell in a Harry Potter movie or something. The effect reminded me very much of Steven Spielberg’s Minority Report, in the way that Tom Cruise’s character could obsessively watch “memories” of his son, and the way the psychic “precogs” perceive their visions of murders about to occur. It’s like watching a dream, but through a portal opened into another world.

When you watch regular (non-spatial) videos using Vision Pro, or view regular still photography, the image appears in a crisply defined window in front of you. Spatial videos don’t appear like that at all. I can’t describe it any better today than I did in June: it’s like watching — and listening to — a dream, through a hazy-bordered portal opened into another world.

Several factors contribute to that dream-like feel. Spatial videos don’t look real. It doesn’t look or feel like the subjects are truly there in front of you. That, however, is true of the live pass-through video you see in Vision Pro, of the actual real world around you. That pass-through video of actual reality is so compelling, so realistic, that in both my demo experiences to date I forgot that I was always looking at video on screens in front of my eyes, not just looking through a pair of goggles with my eyes’ own view of the world around me.

So Vision Pro is capable of presenting video that looks utterly real — because that’s exactly how pass-through video works and feels. Recorded spatial videos are different. For one thing, reality is not 30 fps, nor is it only 1080p. This makes spatial videos not look low-resolution or crude, per se, but rather more like movies. The upscaled 1080p imagery comes across with a film-like grain, and the obviously-lower-than-reality frame rate conveys a movie-like feel as well. Higher resolution would look better, sure, but I’m not sure a higher frame rate would. Part of the magic of movies and TV is that 24 and 30 fps footage has a dream-like aspect to it.

Nothing you’ve ever viewed on a screen, however, can prepare you for the experience of watching these spatial videos, especially the ones you will have shot yourself, of your own family and friends. They truly are more like memories than videos. The spatial videos I experienced yesterday that were shot by Apple looked better — framed by professional photographers, and featuring professional actors. But the ones I shot myself were more compelling, and took my breath away. There’s my friend, Joanna, right in front of me — like I could reach out and touch her — but that was 30 minutes ago, in a different room.

Prepare to be moved, emotionally, when you experience this.

Panoramic and Still Photos

My briefing and demo experience yesterday was primarily about capturing spatial video on iPhone 15 Pro and watching it on Vision Pro, but my demo went through the entire Photos app experience in VisionOS.

Plain old still photos look amazing. You can resize the virtual window in which you’re viewing photos to be as large as you could practically desire. It’s not merely like having a 20-foot display — a size far more akin to that of a movie theater screen than a television. It’s like having a 20-foot display with retina-quality resolution, and the best brightness and clarity of any display you’ve ever used. I spend so much time looking at my own iPhone-captured still photos on my iPhone display that it’s hard to believe how good they can look blown up to billboard-like dimensions. Just plain still photos, captured using an iPhone.

And then there are panoramic photos. Apple first introduced Pano mode back in 2012, with the iPhone 5. That feature has never struck me as better than “Kind of a cool trick”. In the decade since the feature debuted, I’ve only taken about 200 of them. They just look too unnatural, too optically distorted, when viewed on a flat display. And the more panoramic you make them, the more unnatural they look when viewed flat.

Panoramic photos viewed using Vision Pro are breathtaking.

There is no optical distortion at all, no fish-eye look. It just looks like you’re standing at the place where the panoramic photo was taken — and the wider the panoramic view at capture, the more compelling the playback experience is. It’s incredible, and now I wish I’d spent the last 10 years taking way more of them.

As a basic rule, going forward, I plan to capture spatial videos of people, especially my family and dearest friends, and panoramic photos of places I visit. It’s like teleportation.

Miscellaneous Observations

The Vision Pro experience is highly dependent upon foveated rendering, which Wikipedia succinctly describes as “a rendering technique which uses an eye tracker integrated with a virtual reality headset to reduce the rendering workload by greatly reducing the image quality in the peripheral vision (outside of the zone gazed by the fovea).” Our retinas work like this too — we really only see crisply what falls on the maculas at the center of our retinas. Vision Pro really only renders at high resolution what we are directly staring at. The rest is lower-resolution, but that’s not a problem, because when you shift your gaze, Vision Pro is extraordinarily fast at updating the display.
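
To illustrate the concept (this is a toy sketch of the general idea, not anything resembling Apple’s actual implementation), foveated rendering boils down to spending resolution where the eyes are pointed and saving it everywhere else:

    // A toy illustration of the idea behind foveated rendering, not Apple’s implementation.
    // Regions are rendered at a scale that falls off with angular distance from the gaze point.
    func renderScale(forDegreesFromGaze degrees: Double) -> Double {
        switch degrees {
        case ..<5:   return 1.0    // foveal region: full resolution
        case ..<20:  return 0.5    // near periphery: half resolution
        default:     return 0.25   // far periphery: quarter resolution
        }
    }

    print(renderScale(forDegreesFromGaze: 2))    // 1.0: right where you’re looking
    print(renderScale(forDegreesFromGaze: 45))   // 0.25: the corner of your eye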

I noticed yesterday that if I darted my eyes from one side to the other fast enough, I could sort of catch it updating the foveation. Just for the briefest of moments, you can catch something at less than perfect resolution. I think. It is so fast at tracking your gaze and updating the displays that I can’t be sure. It’s just incredible, though, how detailed and high resolution the overall effect is. My demo yesterday was limited to the Photos app, but I came away more confident than ever that Vision Pro is going to be a great device for reading and writing — and thus, well, work.

The sound quality of the speakers in the headset strap is impressive. The visual experience of Vision Pro is so striking — I mean, the product has “Vision” in its name for a reason — that the audio experience is easy to overlook, but it’s remarkably good.

Navigating VisionOS with your gaze and finger taps is so natural. I’ve spent a grand total of about an hour, spread across two 30-minute demos, using Vision Pro, but I already feel at home using the OS. It’s an incredibly natural interaction model based simply on what you are looking at. My enthusiasm for this platform, and the future of spatial computing, could not be higher. 


  1. The HEVC spec allows for a single file to contain multiple video streams. That’s what Apple is doing, with metadata describing which stream is “left” and which is “right”. Apple released preliminary documentation for this format back in June, just after WWDC. ↩︎

  2. According to Apple, these are the average file sizes per minute of video:

    • Regular 1080p 30 fps: 65 MB
    • Spatial 1080p 30 fps: 130 MB
    • Regular 4K 30 fps: 190 MB ↩︎︎

  3. Paraphrased:

    “This is the digital crown. You’ll be using this today to adjust the immersiveness by turning it, and you’ll press the crown if you need to go back to the home screen. On the other side is the button you would use to capture photos or videos using Vision Pro. We won’t be using that button today.”

    “But does that button work? If I did press that button, would it capture a photo or video?”

    “Please don’t press that button.” ↩︎︎