Calling ‘Fake’ on the ‘iPhone Computational Photography Glitch in a Bridal Shop’ Viral Photo

Wesley Hillard, self-described “Rumor Expert”, writing at AppleInsider:

A U.K. comedian and actor named Tessa Coates was trying on wedding dresses when a shocking photo of her was taken, according to her Instagram post shared by PetaPixel. The photo shows Coates in a dress in front of two mirrors, but each of the three versions of her had a different pose.

One mirror showed her with her arms down, the other mirror showed her hands joined at her waist, and her real self was standing with her left arm at her side. To anyone who doesn’t know better, this could prove to be quite a shocking image.

On the contrary, to anyone who does “know better”, this image clearly seems fake. But it’s a viral sensation.

Coates, in her Instagram description, claims “This is a real photo, not photoshopped, not a pano, not a Live Photo”, but I’m willing to say she’s either lying or wrong about how the photo was taken. Doing so feels slightly uncomfortable, given that the post was meant to celebrate her engagement, but I just don’t buy it. These are three entirely different arm poses, not three moments in time fractions of a second apart — and all three poses in the image are perfectly sharp. iPhone photography just doesn’t work in a way that would produce this image. I’d feel less certain this was a fake if there were motion blur in the arms in the mirrors. You can get very weird-looking photos from an iPhone’s Pano mode, but again, Coates states this is not a Pano mode image. (Perhaps you can generate an image like this using a Google Pixel 8’s Best Take feature, but this is purportedly from an iPhone, which doesn’t have a feature like that. And even with Best Take, that’s a feature you invoke manually, using multiple original images as input. I don’t think any phone camera, let alone an iPhone, produces single still images such as this.)

In a thread on Threads, several commenters are rightfully skeptical:

  • Tyler Stalman (who hosts a great podcast on photography and videography):

    Any iPhone photographer can confirm that this is not an image processing error, it would never look like this.

  • David Imel (a writer/researcher for MKBHD):

    I really, REALLY do not think this is a real image. HDR on phones takes 5-7 frames with split-second exposure times. Whole process like .05 sec. Even a live photo is < 2 seconds.

    Even if the phone thought they were diff people it wouldn’t stitch like this and wouldn’t have time.

    This is spreading everywhere and it’s driving me insane.

I challenge anyone who thinks this is legit to produce such an image using an iPhone with even a single mirror in the scene, let alone two. If I’m wrong, let me know.

Update 1: Claude Zeins takes me up on my challenge.

Update 2: In a long-winded story post, Coates says she went to an Apple Store for an explanation and was told by Roger, the “grand high wizard” of Geniuses at the store, that Apple is “beta testing” a feature like Google’s Best Take. Which is not something Apple does — and even if it were, it would require her to have knowingly installed an iOS beta.

Update 3: Best theory to date: it was, despite Coates’s claim to the contrary, taken in Panoramic mode.
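The Pano theory holds up mechanically: panorama mode stitches vertical slices captured one after another as the camera sweeps, so different horizontal regions of the finished image come from different moments in time — each sharp, with no motion blur. Here is a minimal sketch of that idea; the slice counts, timings, region boundaries, and pose labels are all illustrative assumptions, not Apple’s actual pipeline.

```python
# Illustrative sketch: why a panorama can show one moving subject
# in three different, perfectly sharp poses. A pano is stitched from
# vertical slices captured sequentially as the camera sweeps, so each
# region of the final image is an exposure from a different instant.

def subject_pose(t):
    """The subject's (changing) arm pose at time step t."""
    if t < 10:
        return "arms down"
    elif t < 20:
        return "hands at waist"
    else:
        return "left arm at side"

def capture_pano(num_slices=30):
    """Capture one vertical slice per time step, sweeping left to right."""
    return [subject_pose(t) for t in range(num_slices)]

pano = capture_pano()

# Suppose the left mirror lands in slices 0-9, the right mirror in
# slices 10-19, and the real subject in slices 20-29 of the stitch.
left_mirror  = pano[5]    # captured early in the sweep
right_mirror = pano[15]   # captured mid-sweep
real_subject = pano[25]   # captured at the end of the sweep

print(left_mirror, right_mirror, real_subject)
# → arms down hands at waist left arm at side
```

Three different poses in one “single” photo — and because each slice is its own short exposure, none of them would show motion blur, which is exactly what made the image look impossible as a normal still.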

Friday, 1 December 2023