Apple says it every year, but iPhone 13 cameras seem much improved

Cameras remain one of the biggest differentiators in smartphones, and Apple’s iPhone range is no different. Apple says the iPhone 13 and 13 Mini have the company’s “most advanced dual-camera system ever”, while the 13 Pro and Pro Max have “our three most powerful cameras ever.”

That's what you'd hope for, of course. But this year Apple does seem to be taking a genuinely big step with its cameras, especially on the Pro models. As always, the question will be what Apple can wring out of the hardware with image processing and software.

The iPhone 13 series marks the first time Apple has increased the primary camera sensor size across the board since the iPhone XS and XR in 2018, though last year's 12 Pro Max had a 47 percent larger sensor than the 12 and 12 Pro. Sensor size is an important factor in image quality because, along with the aperture, it determines how much light the camera can capture. More light means less noise and blur.
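For a rough sense of how those two factors trade off, here's a back-of-the-envelope sketch; the sensor areas and f-numbers in it are made up for illustration, since Apple doesn't publish exact sensor dimensions.

```python
# Back-of-the-envelope: light gathered scales with sensor area and
# inversely with the square of the f-number. The values below are
# illustrative, not Apple's published specs.

def relative_light(sensor_area_mm2: float, f_number: float) -> float:
    """Return a unitless light-gathering score: area / f-number^2."""
    return sensor_area_mm2 / (f_number ** 2)

# Hypothetical comparison: a 10 percent larger sensor plus a slightly faster lens.
old = relative_light(sensor_area_mm2=25.0, f_number=1.6)
new = relative_light(sensor_area_mm2=27.5, f_number=1.5)
print(f"Relative gain: {new / old:.2f}x")  # ~1.25x in this made-up example
```

The point is that a modest bump in sensor area and a modest bump in aperture multiply together, which is how relatively small spec changes can add up to the kind of light-gathering gains Apple is quoting.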

The iPhone 13 and 13 Mini's main cameras have larger sensors, which is part of the reason the main and ultrawide cameras are now arranged diagonally in the camera bump. Apple has also added sensor-shift optical image stabilization, a feature first seen on the 12 Pro Max. Apple hasn't said exactly how big the 13's sensor is, but it claims the camera captures 47 percent more light than the 12's.

The 13 Pro and Pro Max have an even larger primary sensor and a slightly faster f/1.5 lens that, according to Apple, captures 2.2 times as much light as before. Again, the exact sensor size isn't advertised, but Apple did provide the pixel size: 1.9 µm, which is larger than in any modern smartphone I can think of. Apple can do this because the sensor has a relatively low resolution of 12 megapixels, but it's still an impressive stat that should translate into better low-light performance. For comparison, the 12 Pro Max had 1.7 µm pixels, while every other iPhone since the XS has had 1.4 µm pixels.
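To put those pixel sizes in perspective, a quick bit of arithmetic shows how much per-pixel area each generation gains. The pitches are the figures quoted above; the comparison assumes everything else (microlenses, quantum efficiency) is equal, which of course it isn't in practice.

```python
# Per-pixel light gathering scales roughly with pixel area (pitch squared),
# all else being equal. Pitches are the figures Apple has quoted.

pitches_um = {
    "iPhone XS through 12 Pro": 1.4,
    "iPhone 12 Pro Max": 1.7,
    "iPhone 13 Pro / Pro Max": 1.9,
}

baseline = pitches_um["iPhone XS through 12 Pro"] ** 2
for phone, pitch in pitches_um.items():
    print(f"{phone}: {pitch ** 2 / baseline:.2f}x the per-pixel area of a 1.4 µm pixel")

# 1.9 µm pixels have roughly 1.84x the area of 1.4 µm ones,
# and about 1.25x the area of the 12 Pro Max's 1.7 µm pixels.
```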

It's not clear exactly what hardware changes Apple has made to the iPhone 13's ultra-wide camera; the company simply says it has a "faster sensor" that "reveals more detail in the darker areas of your photos." The Pro, however, gets significant hardware tweaks: Apple has widened the aperture to f/1.8, for a claimed 92 percent improvement in light-gathering capacity, and the sensor now has focus pixels on board. Things are rarely out of focus in ultra-wide shots because the depth of field is so great, but adding autofocus lets the camera be used for macro photography, with a minimum focusing distance of 2cm.
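As a rough illustration of why depth of field is so forgiving on an ultra-wide, here's a hyperfocal-distance sketch. The actual focal length and circle-of-confusion values are assumptions chosen for the example, not Apple's published optics.

```python
# Hyperfocal-distance sketch: with a very short real focal length, almost
# everything is in acceptable focus. Focal length and circle-of-confusion
# values below are assumptions for illustration only.

def hyperfocal_mm(focal_mm: float, f_number: float, coc_mm: float) -> float:
    """Focus at this distance and everything from half of it to infinity
    is acceptably sharp."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# Assumed values: ~1.6 mm actual focal length for a 13 mm-equivalent
# ultra-wide on a tiny sensor, f/1.8, and a 0.002 mm circle of confusion.
h = hyperfocal_mm(focal_mm=1.6, f_number=1.8, coc_mm=0.002)
print(f"Hyperfocal distance: about {h / 1000:.2f} m")          # ~0.71 m
print(f"Everything beyond about {h / 2000:.2f} m is in focus")  # ~0.36 m
```

Under those assumptions everything from a few tens of centimeters to infinity is sharp, which is why fixed-focus ultra-wides get away without autofocus, and why adding it mainly matters for very close macro work.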

The telephoto camera remains exclusive to the 13 Pro phones, and Apple has increased its equivalent focal length to 77mm, or three times that of the primary camera. Previously, the iPhone 12 Pro's telephoto offered 2x zoom, while the 12 Pro Max went out to 2.5x. There's a tradeoff here: if you want to frame something at 2x, the 13 Pro has to crop from the main camera, which reduces image quality. But photos beyond 3x zoom will be much sharper than before, and the longer lens should make for a better portrait camera. Apple has also brought Night mode to the telephoto for the first time.
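The arithmetic behind that tradeoff is simple: cropping an N-times zoom out of the 12-megapixel main camera keeps only 1/N² of the pixels.

```python
# Why 2x framing suffers on the 13 Pro: with the telephoto fixed at 3x,
# a 2x shot has to be a digital crop of the 12-megapixel main camera,
# and cropping to N-times zoom keeps only 1/N^2 of the pixels.

def cropped_megapixels(sensor_mp: float, zoom: float) -> float:
    return sensor_mp / zoom ** 2

print(cropped_megapixels(12, 2.0))  # 3.0 MP left for a 2x crop from the main camera
print(cropped_megapixels(12, 3.0))  # ~1.3 MP if you cropped all the way to 3x instead
```

That second figure is why having real 3x optics matters: past 3x the Pro crops from the telephoto rather than from an already heavily cropped main camera.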

Compared to the Android competition, Apple isn't doing much to beat rivals on raw hardware. The large 1.9 µm pixels are noteworthy, but most Android phone manufacturers prioritize large, high-resolution sensors over sheer pixel size. Xiaomi's Mi 11 Ultra, for example, has a huge 50-megapixel sensor with 1.4 µm pixels, which means it can collect quite a bit of light even when shooting at its native resolution without binning the pixels. And while the 3x telephoto will be useful, 5x periscope telephotos (and occasionally even 10x) are now common in the Android world.
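For a ballpark comparison of total light-collecting area, multiplying pixel count by pixel area gives a rough proxy. It ignores gaps between pixels and microlens losses, so treat the numbers as order-of-magnitude arithmetic rather than specs.

```python
# Rough proxy for total light-collecting area: pixel count x pixel area.
# This ignores gaps between pixels and microlens efficiency, so it's
# ballpark arithmetic, not a measurement.

def approx_sensor_area_mm2(megapixels: float, pitch_um: float) -> float:
    return megapixels * 1e6 * (pitch_um * 1e-3) ** 2

print(approx_sensor_area_mm2(50, 1.4))  # ~98 mm^2 for a Mi 11 Ultra-style sensor
print(approx_sensor_area_mm2(12, 1.9))  # ~43 mm^2 for the 13 Pro's main sensor
```

By that crude measure, a big 50-megapixel sensor with smaller pixels still collects considerably more total light than a 12-megapixel sensor with larger ones, which is the tradeoff the Android makers are betting on.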

So even though Apple has made significant hardware improvements to the iPhone 13 series, performance against competitors will, as always, depend on how well the software and image processing pipeline are tuned. After all, the iPhone 11 was a vastly better camera than the XS the year before, even though the hardware barely changed. This year Apple is touting Smart HDR 4, which can adjust exposure individually for multiple people in the frame, but we'll have to try the phones ourselves to see what kind of difference that makes. The same goes for Photographic Styles, a new filter-like feature that Apple says is smarter about adjusting elements like skin tones and skies in each photo.

As for video, Apple is making a big deal of Cinematic mode, which lets you selectively adjust focus and depth of field after shooting, much like Portrait mode does for photos. We'll certainly have to test that extensively. The 13 Pro, meanwhile, lets you record and edit video in Apple's ProRes codec on the phone itself, or export the ProRes file to Final Cut Pro on a Mac.

All the usual caveats about waiting for full reviews still apply, but this looks like a pretty good year for the iPhone camera. Apple will never have the flashiest hardware, but it has made welcome improvements in areas that make sense, and thankfully it hasn't locked any camera features to the Max-sized iPhone this time. We're looking forward to seeing the results, as well as how imminent competitors like the Pixel 6 stack up.