That’s a long-standing problem that dates back to the era of film: image processing is usually tuned for lighter skin tones, not for Black and brown subjects. Google announced an effort to address this today in its own camera and imaging products, with an emphasis on making images of people of color ‘more beautiful and accurate’. These changes are coming to Google’s own Pixel cameras this fall, and the company says it will share what it learns across the wider Android ecosystem.
In particular, Google is tweaking its automatic white balance and exposure algorithms, trained on a broader dataset of images featuring Black and brown faces, to render dark skin tones more accurately. With these adjustments, Google aims to keep subjects from appearing overly brightened or oversaturated in photos.
Google has also improved portrait mode selfies by building a more accurate depth map for curly and wavy hair types, rather than simply clipping around the subject’s hair.
The company says it still has a lot of work to do – and it has certainly stumbled on image recognition and rendering in the past – but this is a welcome step in the right direction.