Snap Scan Shines New Light on the Company’s Local Discovery Ambitions


Snapchat’s ambition to become a socially fueled local discovery engine recently got a boost with Snap Scan, a visual search tool that makes the world searchable and shoppable. Connecting the dots over several years, Snap’s geo-local efforts include Geofilters, Snap Map, Local Place Promote, and Local Lenses.

Before going into Snap Scan and its recent slate of updates, what’s visual search? This flavor of AR positions the camera as a search input. It applies machine learning and computer vision magic to identify items you point your phone at. Seen in products like Google Lens, it’s all about indexing and annotating the world.
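At its core, that flow is two steps: classify what the camera sees, then look up the predicted label in a content or product index. A minimal sketch of the idea is below; every name in it (`classify_image`, `PRODUCT_INDEX`, the fake feature scores) is a hypothetical stand-in, since real systems like Snap Scan or Google Lens rely on large trained computer-vision models rather than a lookup table.

```python
# Toy visual-search flow: classify an image, then look up the label
# in a product/content index. All names here are hypothetical stand-ins
# for illustration only.

def classify_image(features: dict) -> str:
    """Pretend classifier: return the label with the highest score.
    A real system would run a trained vision model on raw pixels."""
    return max(features, key=features.get)

# A toy content index keyed by recognized label.
PRODUCT_INDEX = {
    "flower": "Care tips and nearby florists",
    "wine":   "Reviews and average bottle price",
    "jacket": "Similar items from retail partners",
}

def visual_search(features: dict) -> str:
    label = classify_image(features)
    return PRODUCT_INDEX.get(label, "No results for " + label)

# Simulated scan: the "wine" signal dominates, so wine content comes back.
print(visual_search({"flower": 0.10, "wine": 0.85, "jacket": 0.05}))
```

The point of the two-step split is that the hard part (recognition) and the monetizable part (the index of products, reviews, and retail partners) are separate assets, which is why partnerships like Photomath and Allrecipes slot in so cleanly.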

Of course, “indexing and annotating” is Google’s jam, which makes it a natural leader in visual search. But Snap is intent on this, too, as it aligns with its “camera company” ethos and growing AR chops. Where Snap was formerly focused on whimsical AR use cases like selfie lenses, visual search flips the script to more utilitarian ones.

It also literally flips the script as it shifts from the front-facing camera (selfie fodder) to the rear-facing camera to augment the broader canvas of the physical world. By doing so, Snap’s visual search capabilities open things up to commerce and a wider range of potential advertisers than AR products that go on one’s face.


New and Unproven

Snap Scan took a step forward recently with a slate of updates. It now recognizes more objects and products, a function of its expanding computer vision models and training data. It can now identify general-interest items like pets, flowers, and clothes — a list that will continue to grow.

Snap plans to continue building this capability by partnering with best-of-breed players in these product areas. For example, its partnership with Photomath lets Snap Scan solve math problems: point your phone at a math problem on a printed page, and Snap Scan solves it, to the delight of math students everywhere.

Its partnership with Allrecipes meanwhile lets users scan food items in their fridge to get recipe suggestions. This can also be used in a grocery store aisle. Similarly, the new Screenshop feature will overlay product information and purchase details from retail partners when users scan style items in the real world.

Beyond new use cases and product categories, Snap Scan is now more accessible. It’s been moved to prime real estate in the Snapchat UX, right on the main camera screen. This should help it to gain more traction and condition user habits. Because visual search is still new and unproven, users need a reminder that it’s there.


Outfit Inspiration

Snap wants Scan to be a launchpoint for shopping by letting users scan clothes for “outfit inspiration.” This could be visual search’s killer app, as it makes the physical world shoppable — meaning both utility for users and monetization potential for Snap.

Beyond fashion, visual search shines whenever products have unclear branding, non-standardized pricing, and uncertain availability. Wine, for example, is a natural fit: it is a complex product whose average price isn’t always transparent. Visual search can qualify buying decisions with content like reviews.

If this all sounds familiar, that’s because it’s essentially web search in 3D. In that sense, visual search carries some of the user intent that has always made search so lucrative. Consumers who actively scan products represent qualified leads for brands and retailers … just like those who type product names into a search bar.

This natural user intent aligns with the business models of companies built on ad revenue. Beyond Google and Snap, that list includes Pinterest, whose Lens product fits the company’s product discovery use case. Projecting these same factors, Instagram could be the next sleeping giant for visual search and “outfit inspiration.”

Back to Snap, the real play may be in visual search’s endgame: AR glasses. Future Spectacles will likely feature Snap Scan to discover the world more naturally. That form factor will take a while to acclimate culturally. But if and when it arrives, its value and reason for being could be amplified by local visual search.

Mike Boland has been a tech & media analyst for the past two decades, specifically covering mobile, local, and emerging technologies. He has written for Street Fight since 2011. More can be seen at