The biggest single hardware upgrade in the iPhone 14 Pro is the
main camera, which now has a 48-megapixel sensor, four times the
pixels of the iPhone 13 Pro. Apple has for years said (accurately)
that counting megapixels is not enough when it comes to measuring
the quality of a camera, and the 12MP camera in the iPhone 8 is
indeed a far cry from the 12MP camera in the iPhone 13 Pro.
True to its word, Apple has taken its flashy 48MP sensor and made
its default mode… a 12-megapixel image. The idea is that Apple’s
new “quad-pixel sensor” allows it to gather light from four
separate pixels and then combine them to create a 12MP image with
superior results, especially in low-light situations. And yes, I
saw much less noise in images generated in 12MP mode.
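The "quad-pixel" technique Snell describes is generally known as pixel binning: groups of four adjacent sensor pixels are combined into one output pixel. A minimal sketch in Python, purely illustrative — on a real sensor the binning happens in hardware, not in software after capture, and the combining step is more sophisticated than a plain average:

```python
# Illustrative sketch of 2x2 pixel binning: each output pixel is the
# average of a 2x2 quad of sensor pixels, so a 48MP grid becomes a
# 12MP grid. (A simplification of what a "quad-pixel sensor" does;
# real binning occurs on-sensor, before the image is ever a file.)

def bin_2x2(sensor):
    """Average each 2x2 block of a 2D grid of pixel values."""
    h, w = len(sensor), len(sensor[0])
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    return [
        [
            (sensor[y][x] + sensor[y][x + 1]
             + sensor[y + 1][x] + sensor[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A 4x4 "sensor" bins down to 2x2. Averaging four readings also
# suppresses per-pixel noise, which is why binned low-light shots
# look cleaner than full-resolution ones.
quad = [
    [100, 104, 200, 204],
    [ 96, 100, 196, 200],
    [ 10,  14,  50,  54],
    [  6,  10,  46,  50],
]
print(bin_2x2(quad))  # [[100.0, 200.0], [10.0, 50.0]]
```

The averaging is also the source of the trade-off Snell notes: you get less noise, but you give up the extra spatial detail the 48 million individual pixels recorded.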
But Apple’s decision is still somewhat puzzling. While you can get
a 48-megapixel image out of the iPhone 14 Pro, you have to do it
by turning on RAW capture in the Settings app. These RAW captures
are slow — it takes a second or more for the camera to be
available to take another shot after you snap one — and they’re
huge (80 to 100 MB each). But they are also, especially in bright
light, spectacularly detailed. Yes, they can be a little noisy,
but with a little work in a RAW photo editor (I used Adobe
Lightroom Classic), I was able to make great-looking images that
had amazing levels of detail, the likes of which I'd never been
able to get out of an iPhone before.
Snell includes a bunch of interesting side-by-side examples in his review. I won't quite argue that Apple was wrong not to include a 48MP JPEG shooting mode, but it does seem like shooting RAW on the iPhone 14 Pro produces more impressive results than with previous iPhone generations. This new main camera sensor is impressive.