By John Gruber
Kyle Chayka, writing for The New Yorker:
In January, I traded my iPhone 7 for an iPhone 12 Pro, and I’ve been dismayed by the camera’s performance. On the 7, the slight roughness of the images I took seemed like a logical product of the camera’s limited capabilities. I didn’t mind imperfections like the “digital noise” that occurred when a subject was underlit or too far away, and I liked that any editing of photos was up to me. On the 12 Pro, by contrast, the digital manipulations are aggressive and unsolicited. One expects a person’s face in front of a sunlit window to appear darkened, for instance, since a traditional camera lens, like the human eye, can only let light in through a single aperture size in a given instant. But on my iPhone 12 Pro even a backlit face appears strangely illuminated. The editing might make for a theoretically improved photo — it’s nice to see faces — yet the effect is creepy. When I press the shutter button to take a picture, the image in the frame often appears for an instant as it did to my naked eye. Then it clarifies and brightens into something unrecognizable, and there’s no way of reversing the process.

David Fitt, a professional photographer based in Paris, also went from an iPhone 7 to a 12 Pro, in 2020, and he still prefers the 7’s less powerful camera. On the 12 Pro, “I shoot it and it looks overprocessed,” he said. “They bring details back in the highlights and in the shadows that often are more than what you see in real life. It looks over-real.”
Chayka’s is an interesting take, for sure. He references Halide’s aforelinked deep analysis of the iPhone 13 Pro camera system (which is what reminded me to link to it) thus:
Yet, for some users, all of those optimizing features have had an unwanted effect. Halide, a developer of camera apps, recently published a careful examination of the 13 Pro that noted visual glitches caused by the device’s intelligent photography, including the erasure of bridge cables in a landscape shot. “Its complex, interwoven set of ‘smart’ software components don’t fit together quite right,” the report stated.
That shot of the bridge was not a good result, but it wasn’t emblematic of the typical iPhone 13 camera experience in any way. I don’t think Chayka is being deliberately disingenuous, but for 99 percent of the photos taken by 99 percent of people (ballpark numbers, obviously), the iPhone 12 or 13 is a way better camera than an iPhone 7. Yet Chayka might leave some readers thinking they’re going to get better photos from a six-year-old iPhone, which simply isn’t true.
The problem is not that iPhone cameras have gotten too smart. It’s that they haven’t gotten smart enough. There most certainly are trade-offs between old-fashioned dumb photography and today’s state-of-the-art computational photography, but those trade-offs overwhelmingly favor computational photography. Chayka’s whole argument feels like the arguments, circa 15 years ago, that shooting on film produced superior results to shooting on digital sensors.
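For the curious: the “strangely illuminated” backlit face Chayka describes is the visible result of multi-frame exposure fusion, the core trick behind computational photography. Here’s a minimal sketch in Python with NumPy of a Mertens-style fusion — a simplified illustration of the general class of technique, not Apple’s actual Smart HDR pipeline; the weighting scheme, the sigma value, and the toy two-pixel scene are all invented for the example.

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend bracketed exposures, favoring well-exposed pixels in each frame.

    frames: list of float arrays in [0, 1] with the same shape, darkest to
    brightest. A Gaussian weight centered at mid-gray (0.5) scores how
    well-exposed each pixel is in each frame; the fused image is the
    per-pixel weighted average across frames.
    """
    stack = np.stack(frames)                      # (N, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0) + 1e-8         # normalize per pixel
    return (weights * stack).sum(axis=0)

# Toy scene: a dim "face" pixel against a nearly blown-out "window" pixel.
short = np.array([[0.05, 0.60]])  # short exposure: face too dark, window OK
long_ = np.array([[0.45, 1.00]])  # long exposure: face OK, window clipped
print(fuse_exposures([short, long_]))  # face pulled up, window detail kept
```

The fused face pixel lands near the long exposure’s value and the window pixel near the short exposure’s, which is exactly why a backlit face comes out bright while the window behind it keeps detail — no single aperture-and-shutter combination could capture both at once.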
★ Tuesday, 22 March 2022