This story is part of Focal Point iPhone 2022, CNET’s collection of news, tips and advice around Apple’s most popular product.
Apple’s iPhone 14 and iPhone 14 Plus smartphones get better main and selfie cameras, but if you’re a serious smartphone photographer, you should concentrate on the iPhone 14 Pro and Pro Max announced Wednesday. These higher-end models have a 48-megapixel main camera designed to capture more detail and, in effect, add a whole new telephoto lens.
The $999 iPhone 14 Pro and $1,099 iPhone 14 Pro Max start with a better hardware foundation. Their main camera’s image sensor is 65% larger than last year’s, a boost that helps double its low-light performance, said Victor Silva, an iPhone product manager. Low-light performance, a critical shortcoming in smartphone cameras, triples on the ultrawide angle camera and doubles on the telephoto.
But it’s the 48-megapixel sensor that deserves the most attention. It can be used two ways. First, the central 12 megapixels of the image can act as a 2x zoom telephoto camera by cropping out the outer portion of the image. Second, when shooting in Apple’s more advanced ProRaw format, you can take 48-megapixel images. That’s good for taking big landscape photos with lots of detail or for giving yourself more flexibility to crop a photo without losing too much resolution.
Cameras are one of the most noticeable changes in smartphone models from one year to the next, especially since engineers have embraced thicker, protruding lenses as a signature design element. Customers who might not notice a faster processor do notice the arrival of new camera modules, like the ultrawide angle and telephoto options that now are common on high-end phones.
Apple unveiled the new camera technology at its fall product launch event, a major moment on the annual technology calendar. The iPhone itself is an enormous business, but it’s also the foundation of a huge technology ecosystem deeply embedded in millions of people’s lives, including services like iCloud and Apple Arcade and accessories like the new second-generation AirPods Pro and Apple Watch Series 8.
Pixel binning comes to the iPhone
Apple has stuck with 12-megapixel main cameras since first using them in the iPhone 6S in 2015. The approach offered a reasonable balance of detail and low-light performance without breaking the bank or overtaxing the phone processors that handle image data. But rivals, most notably Samsung, have added image sensors with 48 megapixels and even 108 megapixels.
More pixels aren’t necessarily better. Increasing megapixel counts means decreasing the size of each pixel, and that can hurt image quality unless there’s lots of light.
But by joining 2×2 or 3×3 pixel groups together into a single virtual pixel, an approach called pixel binning, camera makers get more flexibility. When there’s abundant light, the camera can take a 48-megapixel image that lets you dive into the photo’s details better. But if it’s dim, the camera will use the virtual pixels to take a 12-megapixel shot that suffers less from image noise and other problems.
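The idea can be illustrated with a toy example. This is a minimal sketch of 2x2 binning on a made-up grayscale grid, not Apple's actual pipeline; real sensors bin at the hardware or image-processor level and must account for the color filter array:

```python
# Toy sketch of 2x2 pixel binning: each 2x2 block of physical pixels is
# combined into one "virtual" pixel. Averaging four readouts reduces
# random noise, at the cost of a quarter the resolution.

def bin_2x2(pixels):
    """Average each 2x2 block of a 2D grid into a single virtual pixel."""
    h, w = len(pixels), len(pixels[0])
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    binned = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block_sum = (pixels[y][x] + pixels[y][x + 1] +
                         pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block_sum / 4)  # one virtual pixel per 2x2 block
        binned.append(row)
    return binned

# A 4x4 "sensor" becomes a 2x2 image: quarter the pixels, less noise.
sensor = [
    [10, 12, 20, 22],
    [14, 16, 24, 26],
    [30, 32, 40, 42],
    [34, 36, 44, 46],
]
print(bin_2x2(sensor))  # [[13.0, 23.0], [33.0, 43.0]]
```

The same arithmetic explains the 48-to-12 megapixel drop: grouping 2x2 blocks divides the pixel count by four.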
When shooting ordinary photos with the iPhone 14 Pro models, Apple will take 12-megapixel shots, whether with the ultrawide camera, the main wide-angle camera, the 3x telephoto camera or the new 2x telephoto mode that uses the middle of the main camera sensor. To get the full 48 megapixels, you’ll have to use Apple’s ProRaw mode, which offers more detail and editing flexibility but requires some manual labor to convert into a conveniently shareable JPEG or HEIC image.
“You can now shoot ProRaw at 48-megapixel resolution, taking advantage of every pixel in the main camera,” Silva said. “It’s unbelievable how much we can zoom in.”
The 2x telephoto option uses the 12 million relatively small pixels in the center of the 48-megapixel main camera sensor. That will mean worse image quality than shooting with the full 48 megapixels. But Apple, which increased that sensor size considerably compared with last year, says even those pixels are still bigger than on previous iPhone 2x telephoto cameras.
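The crop-zoom trick itself is simple geometry: keeping only the central quarter of the sensor's pixels doubles the effective focal length while cutting a 48-megapixel frame to 12 megapixels. A rough sketch on a toy grid (again, not Apple's implementation):

```python
# Hedged sketch of "crop zoom": keeping the central half-width by
# half-height region of the frame doubles the effective focal length
# and leaves a quarter of the original pixels.

def center_crop_2x(pixels):
    """Return the central half-width x half-height region of a 2D grid."""
    h, w = len(pixels), len(pixels[0])
    top, left = h // 4, w // 4
    return [row[left:left + w // 2] for row in pixels[top:top + h // 2]]

frame = [[x + y * 8 for x in range(8)] for y in range(8)]  # 8x8 "sensor"
cropped = center_crop_2x(frame)
print(len(cropped), len(cropped[0]))  # 4 4 -> a quarter of the pixels
```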
“We can go beyond the three fixed lenses of the pro camera system,” Silva said. The 2x mode can shoot 4K video, too. Although its pixels have a quarter the area of the main camera’s virtual pixels in 12-megapixel mode, the 2x mode still uses the main camera’s relatively wide f1.78 lens, giving it better light-gathering ability than many smartphone telephoto cameras.
Other iPhone 14 Pro camera changes
Among other changes coming with the new phones:
- The main camera on the iPhone 14 Pro models now has a 24mm equivalent focal length, a bit wider than the 26mm lens Apple has used for years. That’ll accommodate group shots and indoor scenes better, where photographers are more likely to benefit from the main camera’s better low-light performance, but it’ll mean you’ll have to get closer to portrait subjects if you want to fill the frame.
- The ultrawide camera is sharper, improving close-up macro photos, Silva said.
- Apple updated its camera’s flash with a nine-LED system that controls the pattern and intensity of light to accommodate the cameras’ different fields of view. It’s twice as bright in some conditions.
Apple didn’t shine a spotlight on the 3x telephoto camera in its presentation; its focal length is unchanged from the iPhone 13 Pro. The company’s press release called the camera “improved” but didn’t share further details, and Apple didn’t respond to a request for comment.
Anshel Sag, an analyst at Moor Insights & Strategy, would like to see Apple go further, like Samsung has done with its 10x telephoto camera on its Galaxy S22 Ultra. “I love the 10x,” Sag said. “I use it all the time.”
Meet Apple’s Photonic Engine
Much of the improvement in smartphone photography relies on changes that are less visible. Faster processors, including graphics processing units, image processors and AI accelerators, are critical to new computational photography software that’s core to the smartphone photography revolution. In Apple’s new iPhone 14 models, it calls its latest processing system the Photonic Engine.
This technology is an advance over Apple’s previous Deep Fusion technology for merging multiple frames into one shot, preserving detail and texture when lighting is modest or dim. With the Photonic Engine, Deep Fusion begins earlier in the image processing pipeline, working on the raw image data to better preserve detail and color, said Caron Thor, Apple’s senior manager of camera image quality.
All iPhone 14 models get a new action mode that can be toggled on for better stabilization when you’re running around with your camera. It’s not yet clear whether the feature crops the image more tightly, a common consequence of stabilization techniques that use only the central portion of each frame, which stays more consistently in view.
The iPhone 14 Pro cameras also add 4K support to the Cinematic Mode that Apple debuted with the iPhone 13. That mode artificially blurs background parts of the video to focus on the main subject. If a new person enters the frame, the mode can switch focus accordingly.
The iPhone 14 Pro cameras also include upgraded image stabilization that should improve photos and videos.
iPhone 14 and 14 Plus camera upgrades
Apple’s lower-end iPhone 14 and iPhone 14 Plus get a new main camera that gathers 49% more light thanks to a larger sensor and a wider f1.5 aperture, Thor said.
The Photonic Engine technology improves low-light photography on all the new phones’ cameras, though. Low-light performance doubles on the selfie front camera and ultrawide angle back camera, and it improves by a factor of 2.5 with the main camera, she said.
The new selfie camera on the iPhone 14 and 14 Plus has an f1.9 aperture that boosts light gathering by 38% compared with the iPhone 13. And for the first time, it also has autofocus to avoid blurry faces.
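A quick back-of-the-envelope check on that aperture claim: the light a lens gathers scales roughly with 1/N², where N is the f-number. Assuming the iPhone 13 selfie camera’s widely reported f2.2 aperture (a figure not stated in this article), moving to f1.9 alone accounts for roughly a 34% gain; Apple’s 38% figure presumably also reflects other changes:

```python
# Rule of thumb: light gathered scales with 1/N^2 (N = f-number).
# The f2.2 starting point is an assumption based on widely reported
# iPhone 13 specs, not a figure from this article.

def light_gain(old_f, new_f):
    """Fractional light-gathering gain when moving from old_f to new_f."""
    return (old_f / new_f) ** 2 - 1

print(f"{light_gain(2.2, 1.9):.0%}")  # ~34%
```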