Every time Apple launches a new iPhone we get about a week to test before we can publish our initial review. As someone who primarily focuses on the camera features, I always find it a challenge to track down a visually compelling spot where I can fully test what the new camera is capable of.
This year, however, I got lucky. Immediately after I picked up the phones in the Bay Area, I got on another plane to Alaska, where I've been driving overland trucks and fly fishing around Denali. This far north it's already the middle of fall, with bright yellow trees covering the landscape and moose hunters roaming the woods to collect meat for the winter. It's been dumping rain most of the time, but we also got one day of partial sun, and that night the northern lights popped out at 2 a.m.

Needless to say, I’ve had plenty to photograph in a variety of lighting conditions that have helped me dig into the camera upgrades. Once again, I’m impressed with the visual features Apple has been able to pack into a tiny device that also serves as a phone, GPS, TV screen, AI assistant, and many other things. But as I always remind readers, iPhones don’t come close to the power of a full-frame mirrorless camera, and the iPhone 16 Pros are no exception. Here are my initial thoughts on why the 16 Pros are a great adventure camera, but just one tool in a photographer’s belt.
We Love the Camera Control Button (But It Takes Some Getting Used To)
Apple keeps the hardware design of the iPhone purposely simple (no headphone jack, for example), so it's a big deal when they give us a new button (as silly as that sounds). This year they've added the Camera Control button, which is designed specifically for photographers and gives immediate, easy control over several manual camera features.
One hard press of the button, which sits on the lower right-hand side of both Pros, launches the camera. Another hard press takes a photo. But the real power comes when you double soft-press the button (sort of like half-pressing the shutter on a camera), which brings up an entire menu of controls such as exposure, aperture, and focal length.

To select a manual control like exposure, you slide your finger along the button until that control is highlighted, then soft-press once to enter it. Next, you slide your finger along the button again to make a change. With the camera held horizontally, a swipe right on the button while in aperture control opens the aperture wider for less depth of field. A swipe left while in zoom control zooms out and accesses the phone's ultra-wide camera. Another soft double press gets you back to the original menu so you can click into and change another feature.
I was excited to see Apple launch this button because the best photos are often made when you have as much control as possible over your camera. While shooting landscapes in Alaska, for example, I wanted lots of depth of field so I could gather details in the foreground and background and keep everything in the frame as sharp as possible. While shooting white overland trucks, I found it helpful to bring my exposure down a little so that the vehicles weren't blown out against a darker background.
As photographers know, mastering a new camera takes a while, and the same will be true with the Camera Control button. I suspect it will be months before I create the muscle memory needed to use the button quickly and without thinking. I also know that one button on the side of a camera will never match the buttons, dials, and toggles that photographers can use to quickly control bigger mirrorless cameras.

A heads-up for anyone who lives in a rainy environment: while shooting in an Alaskan downpour, it was hard to get the Camera Control button to react to my sliding finger, so I had trouble changing my exposure or focal length. Many times I had to wipe the button dry with my shirt to get it to work. Keep in mind that when you buy a phone case, you'll want one that either has a cutout so you can access the button directly, or one Apple has certified so that the case's covering over the Camera Control button responds properly, without any weird delays or interference.
Apple says that later this year a software update will allow the Camera Control button to lock focus on a subject, so that we can then shift the camera and composition without losing our main point of focus. This is a technique many pro photographers use to get more creative framing in their photos, and it's a smart update from Apple.
More Resolution Is a Good Thing
The iPhone 15s and 15 Pros all came with a larger 48-megapixel (mp) main camera (26 mm and 24 mm equivalents, respectively), which was a big deal because those cameras provided enough resolution to make photo prints large enough to hang on your wall. Last year I commented on seeing more photos on the walls of Apple HQ, and the same was true this year, with iPhone photos hanging all over the spaces we toured.
Unfortunately, the 15 Pros had a bit of shutter lag when shooting 48 mp photos in Apple ProRAW because that format creates huge files. But with the 16 Pros there's zero shutter lag on the standard camera (24 mm equivalent), thanks to a second-generation quad-pixel sensor that reads data twice as fast and something called the Apple Camera Interface, which transfers more data from the sensor to the chip. So photographers can now capture action at the largest resolution possible.

The other big news for the 16 Pros is that their ultra-wide (13 mm equivalent) camera now comes with a 48 mp sensor as well. There’s some shutter lag on this camera when you’re shooting at 48 mp in Apple ProRAW, but I was still able to use it to photograph people fly fishing and overlanding. Thanks to the wide angle and resolution, it was easy to get clean and crisp subjects in the foreground but also capture subtle details in the landscape, creating a photo with lots of depth and character. I’ve yet to print a photo off the ultra-wide camera but suspect I’d have no problem making something that is 11×14 inches or even bigger.
The only hedge I'll include is a reminder that even though the iPhone now has 48 mp sensors behind multiple cameras, that doesn't mean its photos are anywhere near as detailed as those from a full-frame mirrorless camera with similar resolution but a much larger sensor and larger lenses. Those larger lenses and that larger sensor drink in more information and will always win the resolution game.
You Can Now Shoot Slow-Mo 4K Video
Most iPhone users aren't using their phones to create commercial music videos like the one Apple showed during the iPhone 16 Pro keynote presentation. That said, it's definitely convenient to have higher-resolution 4K slow-mo video that you can use to capture action or add drama to social videos.
The new feature enables the phone to capture video in 4K at 120 frames per second (fps), and the Photos app lets you adjust the playback speed after capture. That means you can watch your video at full speed, or dial it down to half speed, quarter speed, or even one-fifth speed. (Because the footage is captured at 120 fps, even one-fifth speed still plays back at a smooth 24 fps.) I shot a guide fly fishing and friends driving trucks through puddles, and found it enormously helpful to be able to choose my playback speed to best highlight the action.
For example, I chose half speed for the fly fishing video because that was slow enough to emphasize the casting movements but not so slow that it made the video boring. When I was editing the trucks chewing through puddles, however, I slowed the video all the way down to one-fifth speed because I liked the drama of the splash coming at me as slowly as possible.
The Styles Feature Has a Lot of Potential
All iPhone 16s come with the ability to control something called Styles. At first glance these look like just another set of filters, but Apple says that's not the case. Instead of applying one simple tone to the entire photo, like some older filters, Styles alter the color balance and tonality in a more sophisticated way that leaves things like skin tone more natural while still adding a certain overall feel to the rest of the photo. There are preset Styles developed by Apple, but each of those can be modified by the user in the Photos app. Note that at this point Styles only work in the HEIF format and do not work on Apple ProRAW files.
Apple says that photographers have created their own tonal styles for decades and that they drew on this history when building Styles. I immediately thought of how modern photographers use Presets in Adobe Lightroom to do something similar. Photographers will create a Preset, or dozens of Presets, that alter the overall tonality of a photo so that everything edited with that Preset has a consistent feel.
When done well, these Presets help photographers nail an aesthetic that's uniquely theirs and create a visual consistency you might compare to an author's tone of voice. I asked a pro adventure photographer I ran into in Alaska what he thinks of Presets, and he said he uses them all the time. That said, he warned that it's taken him hundreds of hours to create his set of Presets, and he's always tweaking them. His advice leads me to believe that Styles, at their best, won't be a magic wand but rather an advanced tool that iPhone users will need to spend some time with to master.
Everything Else You Need to Know About the iPhone 16 Pro
Last year only the iPhone 15 Pro Max came with a 5X zoom (120 mm equivalent), but both 16 Pros now feature that lens, and it sits in front of a 12 mp sensor. I've tested the 5X over the past year, and it's been a fun new way to capture the world and create unique perspectives. For video it's been great to get in ultra close to the action for a more personal experience, and on the photo side it's perfect for portraits where you still want a little background information (unlike Portrait mode, where the background is completely blurred).
A new feature called Audio Mix, which launches with the 16 Pros, will undoubtedly help YouTubers and other social videographers in a big way. There's a lot to this feature, but what's most important to know is that when you record video of people talking in a noisy area, Apple's software can now go into that video and cut out the background noise almost completely, or simply turn it down if you prefer, so that the foreground conversation comes through much more crisply. The effect is like attaching body mics to your subjects or using an overhead mic like you see in film production. It remains to be seen whether video creators who use an iPhone will ditch their mics completely, but Audio Mix will certainly allow some people to just bring their phone.
Apple Intelligence, Apple's version of AI, will be released later this year and, according to Apple, will include important updates to the Photos app. One that caught my eye is the ability to use detailed natural-language queries to search through the thousands of photos stored on your phone and in your cloud. Apple says we should be able to type something as specific as "Maya skateboarding in a tie-dye shirt" into the Photos app's search bar and it will find all the photos that match that description, even if they're buried in photos from four years ago. You can also use this same kind of query to search videos: your phone will sort through all your footage to find the exact spot where Maya is skateboarding in that specific T-shirt.
Finally, Apple Intelligence will also allow users to remove distracting elements from a photo with a few swipes. Personally, I think this is great for photos like family portraits, or anything staged, but it's not something I'm interested in when it comes to other photography. AI features like this, in my opinion, destroy the character of a photo, and it should be the photographer's job to compose the shot the way they want it instead of relying on AI to clean it up or enhance it after the fact.