For Google’s Pixel camera team, it’s all about the memories


Another clear example of this is when a model pulls a surgical mask up over her face: her skin appears darker in the Pixel 8 Pro video, while the Pixel 9 Pro shows far less exposure wobble. The same improvement applies to scenes with multiple people of different skin tones together, where Koenigsberger says there should be less exposure distortion.

When analyzing photos taken on-site, it wasn’t hard to discern the algorithm updates, especially with the luxury of having the models in front of me. Even in normal lighting conditions, the Pixel 9 Pro’s skin tones looked a lot more like people’s in real life, to my eyes, than the Pixel 8 Pro’s did. Koenigsberger says this is also due to general changes to Google’s HDR+ imaging workflow (more on that later), which allows the system to produce more accurate shadows and midtones.

Another new change is automatic white balance segmentation, which helps separate the exposures of people in the image from those of the background. You may have previously noticed some color bleeding into an environment, such as a blue sky that produces a cooler tone on the skin. This new system helps “keep people as they should look, separated from the background,” says Koenigsberger.

Portrait taken with the Google Pixel 8

Portrait taken with the Google Pixel 9

This year’s Pixel 9 series is also the first time that Google’s skin tone classifier has fully aligned with the Monk Skin Tone Scale, a publicly released 10-shade scale that represents a wide range of skin tones and is intended to help with everything from computational photography to healthcare. Koenigsberger says the change allows for much more precise color adjustments.

Most importantly, for the first time, Real Tone has been tested across all of Google’s “Hero” features on the entire Pixel 9 range. Koenigsberger says his team has been able to expand testing to ensure new features like Add Me were vetted for Real Tone before launch. That matters because, as Koenigsberger acknowledges, his team can’t always dedicate as much testing time to the Pixel A-series phones, which could explain some of the issues I had with Real Tone on the Pixel 8A. Hopefully expanding this process will help; Koenigsberger says it turns Real Tone from a specific set of technologies into part of Google’s operating philosophy.

“Ultimately, it’s about a memory for someone,” Koenigsberger says. “It’s going to be their experience with their family, it’s going to be that trip with their best friend; it’s going to be as close to recreating those experiences when we test them, I think, the more reliable we’ll be able to give people something they’re happy with.”

Artificial memories

Memories are the underlying theme driving many of the new features from Google’s camera team. Earlier in the day, I sat down with Isaac Reynolds, the product manager for the Pixel Camera group, which he’s been a part of since 2015 with the launch of the first Pixel phone. Coming up on its 10th anniversary, Reynolds says he’s probably “more excited than many others” about mobile photography, as he believes there’s still plenty of room for improvement in cameras. “I see the memories that people can’t capture because of technical limitations.”

New camera features on Pixel phones are increasingly focused on specific use cases rather than wholesale changes to the overall camera experience, though Reynolds says the HDR+ workflow has been rebuilt on the Pixel 9 series. It brings re-tuned exposure, sharpness, contrast, and shadow blending, plus all the Real Tone updates, which together help create a more “authentic” and “natural” image, according to Reynolds. He suggests that’s what people now prefer over the more processed, punchy, heavily filtered look that was so popular a decade ago.
