Is the Camera the 'New Search Box' for Local Discovery? Part II | Street Fight

This post is an excerpt from an upcoming Street Fight white paper. Titled “Local’s Visual Future: The Rise of AR and Visual Search,” it examines how trends in visual technologies are manifesting in local commerce. The full report will be available in the next few weeks. Stay tuned for more excerpts as well as the full report. More Street Fight coverage of this topic can be read here.

Location Data’s Time to Shine
It’s often said in the ad-tech world, and in other data-reliant sectors, that “Content is King, but Data is God.” This is increasingly true in local ad tech and martech given the need for “ground-truth” conversions to attribute ROI. And it will equally apply in local AR.

Geo-relevant data will play an important – though often overlooked – part in the AR user experience. It’s not widely recognized that Pokémon Go was built on the architecture of Niantic’s Ingress game, whose location tags, accumulated over years of gameplay, made the whole thing work.

Location data’s importance to AR was also validated by Snap. The company recently acquired two location data companies – Placed and Zenly – that feed its continued product development around geo-relevant snaps and AR content like World Lenses.

As background, rudimentary AR such as Pokémon Go sometimes uses geotagged data to position graphics. More advanced AR, such as Apple’s ARKit, instead uses object recognition to map and “register” objects before applying relevant and dimensionally accurate graphical overlays.
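To make the “geotagged data positions the graphics” idea concrete, here is a minimal Python sketch of the underlying geometry. All names and parameters here are ours for illustration, not from any shipping SDK: the idea is to compute the compass bearing from the device to a geotagged point of interest, then map that bearing to a horizontal screen position, which is roughly how rudimentary geo-AR decides where to draw an overlay.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees) from the device to a geotagged POI."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def screen_x(poi_bearing, heading, fov_deg=60, width_px=1080):
    """Horizontal pixel position for an overlay, or None if the POI is off-screen."""
    # Signed angle between the camera's view center and the POI, in (-180, 180].
    offset = (poi_bearing - heading + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None
    return int(width_px * (0.5 + offset / fov_deg))
```

A POI due east of a device facing east would render at the center of the frame; rotate the device 45 degrees and the overlay leaves a 60-degree field of view entirely. Real geo-AR adds altitude, distance scaling, and sensor-fusion smoothing on top of this.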

Though the latter is a more advanced form of AR, it will still benefit from location data such as business/product details or coordinates. This one-two punch will be especially additive in apps for navigation, local discovery, tourism, retail, and several other location-relevant use cases.

Prime Position
Fortunately, a few startups have built systems that collect, clean, and optimize geo-data. Aisle411 has store layouts and product data. Foursquare and xAd validate businesses’ lat-long coordinates, while Yelp and others hold valuable customer reviews, menu items, and operating hours.

This data will come in handy with local AR apps that let users point their phone at a storefront to get useful info. Google Lens (explored in the next section) accomplishes this through object recognition using Street View imagery and local business data from Google My Business.

Google’s Visual Positioning Service (VPS) is another example (also explored below). It lets users navigate interior spaces, such as a Home Depot, using graphical overlays on their smartphones to lead them. But it requires product and blueprint data, and 3D scans of hundreds of stores.

Google has the deep pockets and data assets to pull this off. But the question is whether local AR developers will have access to such product or location data. This could be a business opportunity for companies that currently collect, clean, index, and optimize local business data.

The way this will likely play out is through SaaS-delivered access to location databases that give developers the tools they need to build functional location-relevant AR apps. This could breathe new life into already-valuable location data from publishers and vendors in local.
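As a toy illustration of what such developer access might look like, here is a Python sketch of the core lookup an AR app would make: given the device’s coordinates, return the closest business record to overlay on the storefront. The business names, fields, and in-memory data here are entirely hypothetical; a real SaaS offering would serve something similar over an HTTP API.

```python
# Hypothetical in-memory stand-in for a SaaS location database.
# Real services would expose an equivalent query over the network.
BUSINESSES = [
    {"name": "Corner Cafe", "lat": 40.7411, "lon": -73.9897, "hours": "7am-7pm"},
    {"name": "Hardware Hub", "lat": 40.7420, "lon": -73.9880, "hours": "9am-6pm"},
]

def nearest_business(lat, lon):
    """Return the closest record -- what an AR app would overlay on a storefront."""
    def dist2(biz):
        # Squared degree distance is fine for ranking nearby candidates.
        return (biz["lat"] - lat) ** 2 + (biz["lon"] - lon) ** 2
    return min(BUSINESSES, key=dist2)
```

The value-add of the database vendors named above is precisely the quality of these records: validated coordinates, current hours, and clean product data, without which the overlay is wrong no matter how good the AR rendering is.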

Drilling Down on Google: Visual Search
Speaking of Google and its moves with VPS and Google Lens, the company is taking AR in a slightly different direction, more aptly described as “visual search.” Like AR, these tools use computer vision to identify surroundings, but the emphasis is less on graphics and more on information.

Google Lens, for example, allows users to point their smartphone camera at any physical-world item to launch a search for qualifying or purchase information. One promising use case is identifying storefronts and other key business details when exploring a new neighborhood.

This of course aligns with Google’s mission and the protection of its core search business, where it makes the majority of its $60 billion in revenue. Visual search for Google is a way to broaden its touch points to continue serving consumers information in a lean-forward, “pull” manner.

This all goes back to Google’s challenge in the mobile era: people are searching less because they’re spending more time in apps and less time in the browser… where Google is the front door. So its goal is to counterbalance that decline in search query volume through other means.

This was the principle behind Google’s “Micro-Moments” – to inspire content-snacking moments in which Google can deliver information through Google Assistant (formerly Google Now). The latest embodiments of that principle are voice search (the subject of another report) and visual search.

“A lot of the future of search is going to be about pictures instead of keywords,” Pinterest CEO Ben Silbermann told CNBC in April. Pinterest has since launched several feature updates that let users search for items on its mobile app by pointing their smartphone cameras at physical-world items.

Last Mile to the Cash Register
VPS is a related “visual search” technology that applies computer vision to scan interior spaces and form a point cloud. That unique digital fingerprint then becomes the basis for positional tracking, indoor navigation and overlaying practical information for shoppers in physical stores.
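The “digital fingerprint” idea can be sketched in a few lines of Python. This is a deliberately crude stand-in, with invented names and toy data: it scores how well an observed set of 3D points matches each previously scanned location, then reports the best match. Production VPS relies on far more sophisticated feature extraction and matching, but the localize-by-comparison principle is the same.

```python
def match_score(observed, fingerprint):
    """Mean nearest-neighbor squared distance from observed points to a stored scan."""
    total = 0.0
    for p in observed:
        total += min(sum((a - b) ** 2 for a, b in zip(p, q)) for q in fingerprint)
    return total / len(observed)

def localize(observed, fingerprints):
    """Return the name of the stored scan that best matches the camera's view."""
    return min(fingerprints, key=lambda name: match_score(observed, fingerprints[name]))

# Toy "fingerprints": two previously scanned aisle sections, as 3D point sets.
FINGERPRINTS = {
    "aisle_1": [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    "aisle_2": [(10.0, 0.0, 0.0), (11.0, 0.0, 0.0)],
}
```

Once the device knows which stored scan it is looking at, and where it sits relative to that scan, overlaying shelf and item data becomes a matter of anchoring graphics to known coordinates.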

The go-to example is overlaying positional data for store shelves and the items they carry. As pioneered by interior mapping companies like Aisle411, shoppers can then find obscure items in retail spaces – like the above Home Depot example – solving a consumer pain point.

“GPS can get you to the door, and then VPS can get you to the exact item that you’re looking for,” said Google’s VR/AR lead Clay Bavor at Google’s Spring I/O conference. “Imagine in the future your phone could just take you to that exact screwdriver and point it out to you on the shelf.”

This is not necessarily a new message, because beacon proponents have been saying this for years. VPS is a superior technology, but its optical and sensory components have historically been cost-prohibitive for smartphone integration. The ARCore technology described earlier changes that.

But it goes beyond the utility of finding things. The real angle here is the ad attribution potential, given that it tracks the “last mile” to the cash register. This once again ties to Google’s search ad business, and is just the latest move to bolster search marketing ROI with better attribution.

1 thought on “Is the Camera the ‘New Search Box’ for Local Discovery? Part II”

  1. Fascinating article! It plays perfectly into what I think is one of the biggest questions I ask all of my clients as a Google Trusted virtual tour photographer: “What do you want people to see?” As you may know, the virtual tour photography program, now called Street View Trusted, has been collecting visual data since 2012. It has morphed in branding and shifted its focus over time, but in essence, from what I believe you’re suggesting in this article, it’s likely to be the core content that creates the data needed to make all this work. What do you think?
