But as people tried to capture the scene, and the confusion and
horror that accompanied it, many noticed a strange phenomenon:
Certain photographs and videos of the surreal, orange sky seemed
to wash it out, as if to erase the danger. “I didn’t filter
these,” tweeted the journalist Sarah Frier, posting photos she
took of San Francisco’s haunting morning sky. “In fact the iPhone
color corrected the sky to make it look less scary. Imagine more
orange.” The photos looked vaguely marigold in hue, but not too
different from a misty sunrise in a city prone to fog. In some
cases, the scene seemed to revert to a neutral gray, as if the
smartphones that captured the pictures were engaged in a
conspiracy to silence this latest cataclysm.
The reality is both less and more unnerving. The un-oranged
images were the work of one of the most basic features of digital
cameras: automatic white balance, software that infers the color
of the light in a scene and corrects the image to compensate. Like the people
looking up at it, the software never expected the sky to be
bathed in orange. It’s a reminder that even as cameras have
become a way to document every aspect of our lives, they aren’t
windows on the world, but simply machines that turn views of that
world into images.
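One classic approach to the inference described above is the "gray-world" assumption: the software presumes the scene, on average, is neutral gray, and scales each color channel until that holds. A minimal sketch of the idea (the "orange sky" pixel values below are invented for illustration, not taken from any real photo or from Apple's actual algorithm, which is proprietary and far more sophisticated):

```python
# Gray-world auto white balance: assume the scene averages to neutral
# gray, then scale each channel so its mean matches the overall mean.

def gray_world(pixels):
    """pixels: list of (r, g, b) tuples in 0-255. Returns corrected pixels."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3  # target: every channel should average to this
    gains = [gray / m if m else 1.0 for m in means]
    return [tuple(min(255, round(v * g)) for v, g in zip(p, gains))
            for p in pixels]

# A sky dominated by orange light: red and green far above blue.
orange_sky = [(230, 140, 40), (220, 130, 35), (240, 150, 50)]
corrected = gray_world(orange_sky)
# The algorithm, treating the orange cast as an error in the lighting
# rather than a fact about the scene, pulls every pixel toward gray --
# exactly the "un-oranging" the photographers saw.
```

When the cast really is an artifact of indoor bulbs or shade, this correction is what makes photos look right; when the sky itself is orange, the same logic erases the very thing being photographed.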
This is not a bug, but a side effect of the built-in Camera app on iOS (and likewise on most Android phones) being decidedly consumer-focused. Setting a manual white balance point is a feature in any “pro” camera app worth its salt. My favorite for iPhone is Halide — a recommendation shared by many others. From Halide’s Twitter:
We saw a lot of attention yesterday as people used Halide to take
photos of the eerie orange skies in places hit by wildfires.
We got significantly higher downloads.
It feels wrong to benefit from this, so we are donating
yesterday’s sales to our local Wildfire Relief Fund.
What a move.