By John Gruber
Three years ago, Phil Schiller was my guest on stage at The Talk Show Live From WWDC 2015. Nearing the end (skip ahead to just before the 48:00 mark in the video), and wanting to talk about the iPhone and photography, I said, “I think it’s so clear — and the ‘Shot with iPhone’ marketing campaign shows that you guys clearly believe it too — that Apple has become one of, if not the, leading camera companies in the world.”
Before I could get to a question, Schiller jumped in.
“The,” he said emphatically.
That moment, more than any other in that interview, really stuck with me. Go to Apple.com and look at the products atop the page: Mac, iPad, iPhone, Watch, TV, Music. Of course Phil Schiller thinks Apple is the world’s leading company in all those categories. I’d venture to say almost everyone in the audience for that show would have agreed with that. But to say Apple is the leading camera company in the world, full stop, wasn’t just about comparing Apple to other phone makers. It was about comparing Apple to companies like Canon, Nikon, Sony, Fujifilm, and Leica. And trust me, Schiller — an avid hobbyist photographer — understands exactly what those companies’ cameras are capable of. His emphatic, instant confidence in that statement — that in June 2015 Apple was already the leading camera company in the world — made me think one thing:
I wish I knew what he knows about the iPhone camera pipeline for the next few years.
The last four years, I’ve coyly titled my iPhone reviews “The iPhones 6”, “The iPhones 6S”, “The iPhones 7”, and “The iPhones 8”. That’s not how most people would pluralize these iPhone pairs (but some would — there’s some legitimate precedent with pluralizations like “mothers-in-law” and “attorneys general” where the adjective comes after the noun).
My thinking with the first of these — the iPhone 6 generation — was that I didn’t want to use the title “The iPhone 6 and 6 Plus”, because the year prior I’d used the title “The iPhone 5S and 5C”. That title fit, because the 5S and 5C were definitely different phones — the 5S had a new A7 chip (the first to go 64-bit), Touch ID, and a new camera. The 5C was basically the year-old iPhone 5 in new colorful plastic bodies. The iPhone 6 and 6 Plus were different, but clearly of the same generation. I thought the “The iPhones 6” formulation helped emphasize that. Secondarily, I didn’t want to use the headline “The iPhone 6s” (as a plural) because there seemed to me a very good chance Apple would use the name iPhone 6S (singular) the next year — and I was right. And I didn’t want to use “The iPhone 6’s” (as a plural) because even though it’s acceptable to use an apostrophe followed by an S to pluralize letters and numbers,1 it doesn’t look good.
I’ve been testing an iPhone XS and XS Max — both in gold — since Wednesday evening last week. I spent the first few days mainly with the XS Max, and the remainder mainly with the XS. This year, I strongly considered titling this review “The iPhone XS”. Not to ignore the XS Max, but because I honestly think it’s best to think of them as two sizes of the same iPhone, not two separate iPhones — in the same way we treat color options. Pick your size, pick your color. It makes no more sense to review the iPhone XS and XS Max as different devices than it would to write separate reviews of medium and large cups of the same coffee. (There is no small coffee.)
Exact same A12 system-on-a-chip. Exact same cameras. In my own testing I have seen no discernible difference in performance, display quality, or the cameras’ photo or video quality. Maybe the XS Max is a little louder when you play audio through its speakers? If so, it’s not by much.
The only practical differences in the hardware are that the XS Max has — duh — a larger display (same exact quality as the XS, but more pixels, and thus displays more information on screen) and a larger battery. And in software, only the XS Max gets iPad-style two-column layouts in landscape orientation in apps like Messages and Mail.2 Those are meaningful differences, so “The iPhones XS” it is.
Coming from any previous iPhone from the Plus era, if you preferred the Plus size, I can’t see why you wouldn’t prefer the XS Max now. It’s almost exactly the same height and width as the 6-8 Plus devices, but about a millimeter smaller in both dimensions. That shouldn’t make much of a difference, but I swear, in my hand, it actually somehow feels more comfortable than an iPhone Plus. I don’t know if that’s a steel vs. aluminum thing or just the reviewer’s placebo effect.
Personally, I prefer the XS. But it was a closer call for me than in previous years.
For the remainder of this review, I’m mostly going to talk about the “iPhone XS”, but everything I say pertains to both sizes.
Take a look at Apple’s nifty three-column iPhone Comparison web page. It defaults to comparing the XS, XS Max, and XR.3 Change that third column to last year’s iPhone X, and the differences from top to bottom mostly look rather mild. Part of that is that most of the year-over-year improvements are rather mild. Water resistance, for example, went from IP67 to IP68 — from 1 meter for up to 30 minutes to 2 meters for up to 30 minutes. Nice, but not wow.
But there is one wow factor in comparing the iPhone XS to last year's iPhone X: photography. The reasons don't show up in Apple's comparison spec list (even though some of them could). I've focused nearly the entirety of my testing on taking photos and videos side-by-side against my 10-month-old iPhone X. Overall, I'm simply blown away by the iPhone XS's results. Sometimes the difference is subtle but noticeable; sometimes the difference is between unusable and pretty good. The iPhone XS can capture still images and video that the iPhone X cannot.
It’s worth emphasizing — as I do every year — that normal people do not upgrade their phones after a single year. Most don’t upgrade after two years. They upgrade when their old phone breaks or gets too slow. Anyone upgrading to the iPhone XS from an iPhone 7 or older is getting a great upgrade in dozens of ways, and the camera system is just one of them. I’ve concentrated on comparing the iPhone XS’s camera to the iPhone X’s for two reasons. First, even though most people don’t buy iPhones annually, Apple releases a new generation of iPhones annually, so that year-over-year comparison feels like the natural way to measure their progress. Second, a fair number of people do upgrade annually, or at least consider it (enough people that Apple deemed it worthwhile to create an annual upgrade program), and for the people who own an iPhone X who are considering an upgrade to the XS, to my mind, the camera system is the one and only reason to do it. There are always edge cases. Someone who is a frequent international traveler might consider it worth upgrading just to get the dual SIM support. I’m sure some number of iPhone X owners will upgrade just to get the gold model. But for most people, I’m convinced the camera system is the reason to think about it.
A cynic might argue that the reason Apple spent so much time talking about photography and the camera system (which includes the Apple Neural Engine) is that it’s all they had to talk about this year. I would argue they spent a lot of time talking about photography because there’s a lot to talk about. In fact, I think Apple left out some remarkable aspects of the iPhone XS camera system from the keynote and their website.
Apple didn’t leave this part out. Computational photography and the A12’s vastly improved Neural Engine are central to Apple’s pitch for the iPhone XS camera system. Based on the photos and videos I’ve shot, I believe them.
At a low level the Apple Neural Engine is way beyond my ken. I understand fundamentally how a CPU works. I sort of understand how modern GPUs are much faster than CPUs at certain computations which don’t necessarily pertain to rendering “graphics”. I have no idea how a neural engine works. All I know is it can be seemingly impossibly faster than a CPU or GPU at executing a machine learning model.
The iPhone XS has a seriously improved wide-angle camera. Just in terms of pure old-fashioned optics — light passing through a lens onto a sensor. More — perhaps too much more — on that later. But the iPhone XS has captured images for me that I’m certain can’t be explained by optics alone.
HDR has been around for a long time, and for years on the iPhone. It's basically pretty simple: in difficult lighting conditions (harsh backlighting, for example), HDR combines multiple exposures into a single image. In years past I generally turned HDR off on the iPhone. It was too hit-and-miss. That's why up until last year, iPhones defaulted to keeping a normally exposed image alongside HDR images. But that was a pain in the ass, too — you'd wind up with two images in your camera roll for every picture you took. Even when the HDR image was better, you'd still have the non-HDR version to throw away.
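To make that concrete, here's a toy sketch in Python of the basic idea (emphatically not Apple's pipeline): merge a bracketed burst by weighting, pixel by pixel, whichever frames are well exposed.

```python
# A toy exposure-fusion sketch (not Apple's pipeline): blend a
# bracketed burst by weighting each pixel by how well exposed it is,
# so highlights come from the dark frame and shadows from the bright one.
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """frames: list of float arrays in [0, 1], all the same (H, W, 3) shape."""
    stack = np.stack(frames)                  # (N, H, W, 3)
    # Weight pixels by closeness to mid-gray (0.5); blown highlights
    # and crushed shadows get near-zero weight.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True) + 1e-8
    return (weights * stack).sum(axis=0)      # fused (H, W, 3) image

# e.g. merged = fuse_exposures([under, normal, over])
```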
On the iPhone XS Apple is touting a new feature they call Smart HDR:
Smart HDR. Leveraging multiple technologies — like faster sensors, an enhanced ISP, and advanced algorithms — Smart HDR brings more highlight and shadow detail to your photos.
Here’s my single favorite XS-vs.-X comparison shot. I took it last Friday after Cheaper Than Therapy, a terrific stand-up comedy show in San Francisco co-produced by my friend and sometimes podcast guest Scott Simpson.4
iPhone X (original image file):
iPhone XS (original image file):
I have done no post-processing on these images other than to scale them to a smaller size, and I shot both with the iOS 12 Camera app by just pointing, framing, and shooting. The original images, untouched other than converting from HEIF to JPEG when exporting from Photos, are about 2.2 MB in size.
The difference speaks for itself.
It’s a small theater in a basement. The lighting is just right for the lobby bar of a standup theater, but terrible for photography — ambient light is dim, but the clown has a bright spotlight right on its face. The iPhone X image is blown out; the iPhone XS image looks pretty good. That’s entirely attributable to Smart HDR. Notice too how in the XS shot you can easily read the “Shelton Theater” sign in the back of the clown’s mouth.5
Cheaper Than Therapy does shows four nights a week and people take photos inside that clown every show. Simpson told me the clown’s face always gets blown out. He was genuinely impressed and he’s just a dumb comedian.
I should be showing you pictures, a lot more pictures, not telling you about them. And video clips. They speak for themselves. Alas, Daring Fireball isn’t rigged up for presenting a lot of photos. (Regular readers: “Really? I never noticed that, Grubes.”) I plan to publish a variety of comparison shots in a more appropriate venue after this review.
The way I understand it, Smart HDR is applied to basically all images from the iPhone XS, sometimes more, sometimes less. If an image needs a little highlight recovery, a little Smart HDR is applied; if it needs a lot, it gets a lot. But Photos only applies the "HDR" badge when the effect is really extreme. It didn't even apply the badge to the clown shot above.
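For illustration only, here's my guess at what that sort of logic might look like, with thresholds I made up:

```python
# Illustrative only: a guess at "sometimes more, sometimes less".
# Scale highlight recovery by how much of the frame is blown out,
# and only show the HDR badge in the extreme cases.
import numpy as np

def hdr_strength(image, badge_threshold=0.15):
    """image: float array in [0, 1]. Returns (strength, show_badge)."""
    clipped = (image >= 0.98).mean()              # fraction of blown-out pixels
    strength = min(1.0, clipped / badge_threshold)
    return strength, clipped >= badge_threshold
```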
Here’s another example. This time, Portrait Mode shots of yours truly, taken by my wife at brunch Sunday afternoon. There was bright sunlight streaming through a window over my shoulder.
iPhone X (original image file):
iPhone XS (original image file):
I did not cherry-pick these two images. My wife took a bunch with each camera from the same position (her seat at the table), and the above images are representative of what all the photos from each respective iPhone looked like. Again, unusable vs. pretty good.
I never planned to put pictures of myself in this review. But these were so genuine. Not staged in the least. We didn’t pick our table. These portraits are how real people take real photos in the real world. Across a table from each other, enjoying a nice meal, on a nice day. Point, shoot. And this is what she got.
Here’s one more image from that bunch, which I didn’t use in the A/B comparison above because the framing and angle are slightly different.
iPhone XS (original image file):
Here’s a crop of that image at 100 percent:
I’ve never seen Portrait Mode on the iPhone X isolate individual strands of hair like that. That’s the exception rather than the norm on the XS, but still. [Update: As numerous sharp-eyed readers have pointed out, those hairs are in focus not because Portrait Mode did something right, but because it does something wrong — the depth mask has that whole dark column between windows behind my head in focus, and the side of my head just happened to fall in that region. It’s sort of a Bob Ross “happy accident” that I like the way this looks.]
My takeaway is that the Neural Engine really is a big deal for photography and video. Supposedly, it’s just as big a deal for AR, but the camera has been my obsessive focus this past week. For users, it’s a big deal because it has a dramatic, practical, real-time effect on the quality of the photos and videos they can shoot. None of this happens in post; all of it is visible live, as you shoot. And for Apple, it’s a big deal because I don’t think any of their competitors have something like this. Support for the Neural Engine permeates iOS and the entire A12 I/O system. Android handset makers can’t just buy a “neural engine” chip and stick it in a phone. Google does advanced machine learning — including for photos — but they do it in the cloud. You shoot a photo, upload it to Google’s servers, and they analyze the dumb photo to make it better. Their input is a JPEG file. (With the exception of their Pixel phones, where they do design the hardware and can apply machine learning on input from the sensors.)
With the iPhone XS and Apple Neural Engine, the input isn't an image, it's the data right off the sensors. It's really kind of nuts how fast the iPhone XS camera is doing things in the midst of capturing a single image or frame of video. One method is to create an image and then apply machine learning to it. The other is to apply machine learning to create the image. One way Apple is doing this with video is by capturing additional frames between frames while shooting 30 FPS video, even when shooting 4K. The whole I/O path between the sensor and the Neural Engine is so fast that the iPhone XS camera system can manipulate 4K video frames like Neo dodging bullets in The Matrix.
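Apple hasn't documented how those in-between frames get used. My assumption is that it's something like this toy sketch (reusing the fuse_exposures function from the HDR sketch above): capture a short exposure alongside each standard frame and merge the pair before it's encoded.

```python
# A toy sketch of inter-frame capture (my guess at the general idea,
# not Apple's actual pipeline): while recording 30 FPS video, grab a
# short exposure alongside each standard frame and fuse the pair so
# highlights survive in the frame that gets encoded.
import numpy as np

def toy_sensor(scene, exposure):
    """Hypothetical sensor: linear-light scene scaled by exposure, then clipped."""
    return np.clip(scene * exposure, 0.0, 1.0)

def record_frame(scene):
    # Both captures plus readout must fit in the ~33 ms per-frame budget.
    normal = toy_sensor(scene, exposure=1.0)    # main, well-exposed frame
    short = toy_sensor(scene, exposure=0.25)    # 2 stops under, keeps highlights
    return fuse_exposures([normal, short])      # merge from the sketch above

# scene = np.random.rand(1080, 1920, 3) * 2.0   # linear light can exceed 1.0
# frame = record_frame(scene)
```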
Apple describes the XS as sporting “dual 12MP wide-angle and telephoto cameras”. This will be obvious to most of you, but in case it’s not, they’re not just dual rear-facing lenses, they’re dual rear-facing cameras. The wide-angle and telephoto lenses each have their own sensors. As a user you don’t have to know this, and should never notice it. The iPhone XS telephoto camera is the same as in the iPhone X — same lens, same sensor.
But the iPhone XS wide-angle camera has a new lens, which I believe to be superior to last year’s, and an amazing new sensor which is remarkably better than last year’s. And last year’s was very good.
I don’t want to wander too far into the weeds here, but bear with me. Focal length determines how wide or narrow a lens’s angle of view is: a wide-angle lens has a shorter focal length; a telephoto lens has a longer one. The camera industry advertises focal lengths in terms of their equivalence to a 35 mm film camera system. So on an actual 35 mm film camera or a full-frame DSLR (so called because the sensor is the same size as a frame of 35 mm film), a “28 mm lens” has an actual focal length of 28 mm.
Phone camera sensors are way smaller than 35 mm. They’re tiny in comparison. The lenses are tiny in comparison too. The actual focal length of a phone camera lens is much smaller than the focal length in 35 mm equivalent terms. So for example, the telephoto lens on both the iPhone X and XS has an equivalent focal length of 52 mm. That means if you took photos from the same spot with the iPhone XS telephoto lens and a full-frame DSLR with a 52 mm lens, you’d capture the same field of view in the resulting images from both cameras. They would appear to be equally wide. But the actual optical focal length of the iPhone XS telephoto lens is 6 mm. You can see the actual focal length of the lens used to capture any image in your library by opening the Info palette in the MacOS Photos app — the focal length is right next to the ISO value. It’s stored in the image as part of the EXIF data.
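If you'd rather check the EXIF data programmatically, Pillow can pull both numbers out of an exported JPEG. A minimal sketch (the filename is a placeholder):

```python
# Read the actual and 35 mm equivalent focal lengths from a photo's
# EXIF data with Pillow. "IMG_0001.jpg" is a placeholder filename.
from PIL import Image

exif = Image.open("IMG_0001.jpg").getexif().get_ifd(0x8769)  # Exif sub-IFD
print("Actual focal length:", exif.get(0x920A), "mm")   # e.g. 4.25
print("35 mm equivalent:", exif.get(0xA405), "mm")      # e.g. 26
```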
Here’s where it gets interesting. (I swear.) The iPhone X’s wide-angle lens had an equivalent focal length of 28 mm. Its actual focal length was 4.0 mm.
When I first started comparing side-by-side shots from the iPhone XS and iPhone X using the wide-angle lens, I noticed that the shots from the iPhone XS had a slightly larger field of view. They were a little bit wider. Look at the photos of the clown photo booth above and you can see it clearly. I didn’t move at all between those shots — both phones were roughly the same distance from the subjects, but the iPhone XS captured more of the scene. Apple confirmed to me that this is true — the iPhone XS wide-angle lens has an equivalent focal length of 26 mm. Not a lot wider, but enough to be noticeable. But when you look at the actual focal length of the lens in Photos (or any other app that can display the EXIF data of the image files), it is 4.25 mm.
0.25 mm may sound tiny but consider that the “telephoto” lens is only 6 mm. The equivalent focal length is wider, but the actual focal length is longer. This made no sense to me at first. Then I realized it would make sense if the camera sensor were a lot larger. And lo, here’s what Apple’s iPhone XS camera page says:
More low‑light detail. The camera sensor features deeper, larger pixels. Deeper to improve image fidelity. And larger to allow more light to hit the sensor. The result? Even better low‑light photos.
“Larger” is all they say. Not how much larger. That left me with the assumption that it was only a little bit larger, because if it were a lot larger, they’d be touting it, right? There are “field of view” calculators you can use to compute the sensor size given the other variables, so I used one, and by my calculations, the sensor would be over 30 percent larger.
I repeat: over 30 percent larger.
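Here's the back-of-the-envelope math, for anyone who wants to check my work. The equivalent focal lengths are rounded marketing numbers, so the results are approximate:

```python
# Back-solving sensor size from focal lengths: the crop factor is
# (equivalent / actual), and the sensor diagonal is the full-frame
# diagonal divided by that. Inputs are rounded marketing numbers,
# so the outputs are approximate.
import math

FULL_FRAME_DIAG = math.hypot(36, 24)      # a 36 x 24 mm frame: ~43.27 mm

def sensor_diag(equiv_mm, actual_mm):
    return FULL_FRAME_DIAG / (equiv_mm / actual_mm)

x_diag = sensor_diag(28, 4.0)             # iPhone X wide:  ~6.18 mm
xs_diag = sensor_diag(26, 4.25)           # iPhone XS wide: ~7.07 mm
area_gain = (xs_diag / x_diag) ** 2 - 1   # ~0.31, i.e. over 30 percent

# On a 4:3 sensor (a 3-4-5 triangle), width = diagonal * 4/5, and
# 12 MP at 4:3 is 4032 pixels across, so the pixels grow too:
x_pitch = x_diag * 0.8 / 4032 * 1000      # ~1.23 microns
xs_pitch = xs_diag * 0.8 / 4032 * 1000    # ~1.40 microns
print(f"{area_gain:.0%} larger; pixels {x_pitch:.2f} -> {xs_pitch:.2f} um")
```

By that arithmetic the pixels grow from roughly 1.2 to 1.4 microns, which squares with Apple's "deeper, larger pixels" line.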
That seemed too good to be true. But I checked, and Apple confirmed that the iPhone XS wide-angle sensor is in fact 32 percent larger. That the pixels on the sensor are deeper, too, is what allows this sensor to gather 50 percent more light. This exemplifies why more “megapixels” are not necessarily better. One way to make a sensor bigger is to add more pixels. But what Apple’s done here is use the same number — 12 megapixels — and make the pixels themselves bigger. 12 megapixels are plenty — what phone cameras need are bigger pixels.
I think what makes this 32 percent increase in sensor size hard to believe, especially combined with a slightly longer lens, is that by necessity, this combination means the sensor must be further away from the lens. This basic necessity of moving the lens further from the sensor (or film) is why DSLRs are so big compared to a phone. But the iPhone XS is exactly the same thickness as the iPhone X, including the camera bump. (Apple doesn’t publish the bump thickness but I measured with precision calipers.) So somehow Apple managed not only to put a 32 percent larger sensor in the iPhone XS wide-angle camera, but also moved the sensor deeper into the body of the phone, further from the lens.
And to geek out even more: because the actual lens on the XS is longer than the X’s, it gets this wider field of view without introducing additional barrel distortion — in fact, I suspect there’s less. Slightly less of that generally undesirable fisheye effect, even though the field of view is slightly wider.
Why isn’t Apple touting this larger sensor? Well, look at how long it took me to get to the end of this section. It’s a rabbit hole. They got out of this whole digression by just saying the sensor gathers 50 percent more light. On the one hand, that’s really all that matters. But on the other hand, to my ears at least, “50 percent more light” seems a bit hand-wavy.
“32 percent larger sensor”, however, means something very specific, and it should perk up the ears of any photographer — even one who’s skeptical of Apple’s “computational photography” claims. You could sell an upgrade to the XS to iPhone X-owning photo enthusiasts just by telling them the sensor is so much larger.
The other explanation I can think of is that this almost certainly isn’t Apple’s own sensor. Camera sensors aren’t something Apple designs on its own (yet?). So maybe they don’t want to call extra attention to something that is bound to appear in other high-end phones soon.
Apple isn’t celebrating this new sensor, but photographers will be.
I knew that last year’s iPhone X was thicker than the previous few years of iPhones, which was interesting in and of itself. But I didn’t think about it as part of a years-long trend. The iPhone 5 series phones (including the SE) were all 7.6 mm thick, with no camera bump. Ignoring the bumps, the iPhone 6 slimmed to 6.9 mm, and thickness crept up to 7.1 mm with the iPhone 6S and 7, and then up to 7.3 mm with the iPhone 8. (The Plus model counterparts of each generation were all 0.2 mm thicker.) One common refrain during the iPhone 6 era was that Apple had taken its obsession with thinness too far. No one thought the iPhone 5 wasn’t thin enough, but lots of people had problems getting all-day battery life. Why didn’t Apple maintain the same overall thickness of the thin-enough iPhone 5 and use the extra volume for a slightly bigger battery?
Well, the iPhone X, XS, and XS Max are all 7.7 mm thick — just a hair thicker than the iPhone 5. And those iPhones all share the biggest ever camera bump — at the bump, they’re all 9.05 mm thick. And the iPhone XR is even thicker: 8.3 mm. Apple does tend to make its products ever thinner and lighter, but they’ve reversed course. Every model in the iPhone X series is thicker than any iPhone since the 4S, and each generation of iPhone has gotten thicker since the iPhone 6. I think this is the right trade-off, both for battery life and to allow for a bigger camera.
Apple is calling the iPhone XS Face ID “Advanced Face ID”. I asked why they weren’t calling it “second generation”, like they did with Touch ID, and was told it’s because Face ID is more of a system. Second-generation Touch ID was a single new component. A complicated component, but a component. The improvements to Face ID aren’t just a component in the sensor array — they’re tied to things like the Neural Engine in the A12. Is it actually an improved experience? I think so, but it’s hard for me to say because Face ID works so well for me with my iPhone X.
Apple describes the glass (front and back) on the iPhone XS as “the most durable glass ever in a smartphone”. I asked, and according to Apple, this means both crack and scratch resistance.
One imperfection: the antenna lines on the iPhone X were symmetric; on the iPhone XS they are not. And a side effect of the new antenna line on the bottom left is that the speaker grille holes are no longer symmetric — there are 3 on the left and 6 on the right on the XS, and 4 on the left and 7 on the right on the XS Max. The iPhone X had 6 symmetric holes on each side. I guessed this new antenna layout was for dual SIM support, but I was wrong. It’s for 4 × 4 MIMO license-assisted access technology, which is how the iPhone XS supports gigabit LTE where available. I would trade symmetry for gigabit LTE — my only question is how many people can actually take advantage of it.
Update: I originally wrote here that adjusting the depth of field of a Portrait Mode shot after the fact required one of the new iPhones. That’s all wrong. Any iPhone or iPad running iOS 12 can edit the bokeh depth of field of a Portrait Mode shot from an iPhone XS or XR. Even better, when Mojave ships, Photos for Mac will be able to do it too. My mistaken assumption was reasonable, though. Before last week’s Apple event, I updated my iPhone X to the developer GM of iOS 12, and throughout my testing of the iPhone XS, that’s what I was using. But the developer GM build of iOS 12 didn’t have the depth-of-field stuff in it, because that was a secret held for the keynote announcement of the feature. If it had been in the GM, Guilherme Rambo or Steven Troughton-Smith would’ve found it before the keynote. That never occurred to me — and I thus assumed the GM build I was running was feature-complete. Anyway, I’m happy to be wrong about this.
The default wallpapers for iPhone XS (they’re actually soap bubbles, not planets, as I originally thought) hide the sensor array notch with a black background. Purely a coincidence, I’m sure. The notch was a worthwhile tradeoff. But the device looks better when it’s hidden by a black background, and that’s how the iPhone XS looks in the initial ad campaign.
I asked Apple about the “improved wireless charging”. It’s a classic S-cycle refinement. It’s not a new charging standard or support for a higher wattage or anything like that. The iPhone XS has a new, tighter coil design, which makes it more forgiving when it’s misaligned on the charging pad. I’ve run into this myself — sometimes with the iPhone X, if you’re not really paying attention when you put it on the pad, it either charges very slowly or not at all. The iPhone XS has a larger sweet spot.
There’s a cool XS-exclusive feature in Settings → Camera → Record Video. If you’re shooting at 30 FPS — whether in 720p, 1080p, or 4K — you can enable “Auto Low Light FPS”, which drops the frame rate to 24 FPS on the fly whenever the phone deems it necessary to get better low-light exposures. This can happen in the middle of recording. Start recording in a bright room and move to a dark one, or turn down the lights, and the frame rate will change within the clip.
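Apple doesn't say how the phone decides. Presumably the logic is something like this sketch, with hysteresis so the rate doesn't flap back and forth (the lux thresholds are made up):

```python
# Illustrative logic only: how a recorder might decide, frame by frame,
# to drop from 30 to 24 FPS. At 24 FPS each frame can be exposed up
# to 1/24 s instead of 1/30 s, about 25 percent more light per frame.
def choose_fps(scene_lux, low=10.0, high=20.0, current=30):
    """Hysteresis: two thresholds keep the rate from flapping at the boundary."""
    if scene_lux < low:
        return 24     # dark scene: trade frame rate for exposure time
    if scene_lux > high:
        return 30     # bright scene: back to full frame rate
    return current    # in the dead band, keep whatever we're doing
```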
For anyone upgrading from an older iPhone, the iPhone XS and XS Max should seem amazing in every regard. Compared to last year’s iPhone X, the XS and XS Max are solid refinements across the board, but deliver a dramatic year-over-year step forward in photo and video quality. If you care about the image quality of the photos and videos you shoot with your phone, it’s hard to resist.
What I find most interesting is that the two things responsible for that step forward — the A12 system (including the same Apple Neural Engine) and the much larger new wide-angle camera sensor — are included in the upcoming iPhone XR, which, for the same amount of storage, costs $250 less than the XS and $350 less than the XS Max. I suspect there are a lot of people out there who don’t care about the telephoto lens on the XS and who don’t see much if any difference between the XR’s LCD display and the XS’s OLED one who are looking at these prices thinking they must be missing something. They’re not.
iPhones can’t compete with big dedicated cameras in lens or sensor quality. It’s not even close. The laws of physics prevent it. But those traditional camera companies can’t compete with Apple in custom silicon or software, and their cameras can’t compete with iPhones in terms of always-in-your-pocket convenience and always-on internet connectivity for sharing. In the long run, the smart money is to bet on silicon and software.
Mind your p’s and q’s; don’t forget to dot your i’s and cross your t’s. ↩︎
Personally, I never use my iPhone in landscape orientation except when using the camera, watching video, or playing a game. So that feature is meaningless to me — neither a plus nor a minus. ↩︎︎
I don’t have an XR in hand, and haven’t seen one since Apple’s event last week. If history is any guide, they won’t seed review units until mid-October, a few days before they start taking pre-orders on the 19th and 10-11 days before they start shipping and arrive in stores on the 26th. ↩︎︎
If you live in San Francisco or are ever in town and you enjoy standup, I highly recommend you go. It’s just great. Get tickets in advance, though — most shows sell out. ↩︎︎
“In the Back of the Clown’s Mouth” would be a good title for a detective novel. ↩︎︎