Snapchat’s scan feature can identify dogs, plants, clothes, and more

Snapchat’s camera has so far been mostly associated with sending disappearing messages and crazy AR effects, like a virtual dancing hot dog. But what if it did things for you, like suggest ways to make your videos look and sound better? Or show you a similar shirt based on the one you’re looking at?

Starting Thursday, a feature called Scan is being upgraded and placed front and center in the app’s camera, allowing it to identify a range of things in the real world, such as clothing or dog breeds.

Scan’s prominent placement in Snapchat means that the company is slowly becoming not just a messaging app, but also a visual search engine. Scan also helps address a growing problem for Snapchat users: how to find the millions of AR effects or lenses created by Snap’s creator community. With the ability to suggest lenses based on what you’re looking at, Scan could give more visibility to the lenses people make, encouraging them to keep creating AR content for Snapchat.

Visual search is not a new idea. In 2017, Google debuted Lens, which allows users to scan items through their phone camera and identify them using its comprehensive search results index. Lens is integrated into the Google Pixel phones and a number of other Android handsets, as well as baked into Google’s main mobile app. Pinterest also has its own visual search feature called Lens that shows similar images based on what you scan into the app.

Video still by Weston Reel for The Verge

While Snap is catching up, it arguably has a better chance of making visual search mainstream. Because Snapchat opens straight to the camera, any change there has major implications for how its nearly 300 million daily users interact with the app. Snap says more than 170 million people already use Scan at least once a month — and that was before it was put front and center in the camera, as it is today.

“We definitely think Scan will be one of the priorities for [Snapchat’s] camera moving forward,” Eva Zhan, Snap’s head of camera product, told The Verge in an exclusive interview. “In the long run, we see the camera doing a lot more than it can do today.”

Snap first started working on Scan a few years ago after seeing Snapchat users embrace scanning profile QR codes as a way to add friends in the app. After initially working with Shazam to identify songs and Photomath to solve math problems through the camera, Snap added the ability to identify items for sale on Amazon.

This latest version of Scan, which Snap previewed at its developer conference earlier this year, adds detection for dog breeds, plants, wine, cars, and nutritional information. Most of Scan’s features are powered by other companies; for example, the app Vivino is behind the wine scan feature. Soon, Allrecipes will power a Scan feature that suggests recipes to make based on a specific food ingredient. Snap plans to add more capabilities to Scan over time with a mix of external partners and what it builds internally.

Scan’s biggest new addition is a shopping feature built by Snap and aided by the recent acquisition of Screenshop, an app that lets you upload screenshots of clothing and purchase similar items. Scan can recommend similar clothes based on what you look at and let you buy clothes you discover. Scan’s shopping feature will also soon be added to Snapchat’s camera roll section called Memories, allowing people to buy clothes based on what they’ve saved from their camera or screenshots.

Another core pillar of Scan is what Snap calls camera shortcuts. It works by recommending a combination of a camera mode, a soundtrack, and a Lens. So when you point the camera at the sky, Lenses specifically designed to work with the sky appear alongside a song clip and a color filter, letting you apply all the changes at once. According to Zhan, Snap is working on adding camera shortcuts to its TikTok rival Spotlight, potentially allowing the viewer of a video to quickly jump into their own camera with the same setup used to create the video they just watched.


GIF: Snap

I found Scan’s camera shortcuts compelling at first, but they’re currently limited to just a few situations: shots of the sky, human feet, dogs, and dancing. Snap plans to expand the situations where camera shortcuts work over time, and the integration with Spotlight shows how they could become a more integral part of the video creation process.

Snap wants Scan to be an important way for users to discover AR lenses in the future. It recently started letting its AR makers tag their lenses with relevant keywords that help Scan suggest the right lenses based on what the camera sees.

Video still by Weston Reel for The Verge

After testing the new Scan over the past few weeks, I found it to be hit or miss. There were plenty of instances where Scan misidentified things or didn’t work at all, such as when it failed to recognize the piece of clothing I was trying to get results for, as well as times when it worked perfectly. Sometimes the suggested Lenses were relevant, and other times they were clearly not contextually relevant at all.

That said, Snap promises that Scan will get better over time, both in its ability to identify things accurately and in the categories of objects it can detect. No data from Scan is currently used for ad targeting, but it’s easy to see how the feature could inform more shopping or advertising efforts down the road.

Photo: Snap

Scanning becomes more appealing in a future world with people wearing AR glasses, such as the latest Snap Spectacles. It doesn’t feel natural to me to point my phone at things in the real world to identify them, but the behavior makes more sense when I’m wearing smart glasses that can scan my surroundings.

Snap is already anticipating this: the new Spectacles have a special scan button on the frame that activates Lenses based on what the wearer is viewing. (The new Spectacles are not for sale. Instead, Snap will give them to select AR creators and partners who request access.)

While Scan is pretty bare-bones now, it shows how Snap is evolving the use cases for its camera. Snap sees Scan as an important part of Spectacles — and possibly other cameras — in the future, Zhan said. “We definitely don’t want to limit Scan to just the Snapchat camera.”