
Connecting the Dots on Google’s Visual Road Map

Google continues to double down on visual search and navigation. Its latest move came last week with updates to Live View, its visual navigation feature, to help users identify and qualify local businesses. This follows soon after Earth Cloud Anchors, which will let users anchor digital content to physical places. Both developments signal what may be a key component of local search’s future: a visual front end.

We’ll dive into each of these updates and what they mean collectively. But first, as background, Google continues to invest in visual search as a sort of hedge to future-proof its core search business. Given Gen-Z’s affinity for the camera, Google wants to lead the charge to make it a search input.

This includes Google Lens, whose “search what you see” proposition lets users point their cameras at real-world objects to contextualize them. This starts with general interest searches like pets and flowers, but the real opportunity is a high-intent shopping engine that’s monetized in Google-esque ways.

Live View similarly uses the camera to help users navigate with 3D urban walking directions. Instead of doing the mental mapping required to translate a 2D map into 3D space, users can hold up their phones and see directional arrows overlaid on the street, which is more intuitive. And as with Google Lens, monetization is on the road map.

Google is uniquely positioned with both of these visual-search products because they tap into its knowledge graph and the data it has assembled over 20 years as the world’s primary search engine. Lens draws on Google’s vast image database for object recognition, while Live View uses Street View imagery to localize the user.

Last week, Google announced a new feature that sort of combines Lens and Live View. While navigating with Live View, Google now offers small click targets on your touchscreen when it recognizes a business storefront. When tapped, expanded business information appears.

This is something Google has been teasing for a few years. It can include business details that help users discover and qualify new businesses. The data flows from Google My Business (GMB), and the feature’s current version offers a small set of structured listings content, such as hours of operation.
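To make that concrete, here’s a hypothetical sketch — my own illustrative model, not Google’s GMB schema or API — of the kind of structured listing fields a tapped storefront overlay might render:

    // Hypothetical model (not Google's GMB schema) of structured listing data
    // that a Live View storefront overlay could surface when tapped.
    data class LiveViewListing(
        val businessName: String,
        val category: String,
        val rating: Double,       // aggregate star rating, e.g., 4.6
        val hoursToday: String,   // e.g., "7:00 AM - 5:00 PM"
        val openNow: Boolean
    )

    // Example payload for a tapped storefront card.
    val exampleListing = LiveViewListing(
        businessName = "Example Coffee Roasters",
        category = "Coffee shop",
        rating = 4.6,
        hoursToday = "7:00 AM - 5:00 PM",
        openNow = true
    )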

As mentioned, this update came a few weeks after Google’s less-discussed Earth Cloud Anchors announcement. Another vector in Google’s overall visual search and AR master plan, this ARCore feature lets users geo-anchor digital content for others to view. It could feed into the Google Lens and Live View updates above.
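For developer context, here’s a minimal sketch of the host-and-resolve flow in ARCore’s Cloud Anchors API, which the Earth Cloud Anchors work builds on. It assumes an ARCore Session already configured with cloud anchors enabled and a local anchor created from a user tap; it illustrates the general mechanism, not Google’s Earth Cloud Anchors implementation itself.

    import com.google.ar.core.Anchor
    import com.google.ar.core.Session

    // Host: uploads visual feature data around the local anchor so other
    // devices can later recognize the same physical spot.
    fun hostAnchor(session: Session, localAnchor: Anchor): Anchor =
        session.hostCloudAnchor(localAnchor)

    // Poll each frame until hosting succeeds, then save the cloud anchor ID
    // (e.g., keyed to a place or business) for other users to resolve.
    fun checkHosting(hostedAnchor: Anchor, saveId: (String) -> Unit) {
        if (hostedAnchor.cloudAnchorState == Anchor.CloudAnchorState.SUCCESS) {
            saveId(hostedAnchor.cloudAnchorId)
        }
    }

    // Resolve: a second device re-creates the anchor at the same location,
    // so content attached to it appears in the same physical place.
    fun resolveAnchor(session: Session, cloudAnchorId: String): Anchor =
        session.resolveCloudAnchor(cloudAnchorId)

The key design point is that the anchor ID is just a string that can be stored and shared, which is what would make the user-generated layer described next persistent across devices and users.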

In other words, Cloud Anchors could be a sort of user-generated-content component of local visual search. It could have social, educational and serendipitous applications such as digital scavenger hunts and hidden notes for friends. But a local business ratings and reviews use case could likewise develop.

As a side note, Google isn’t the only one working on “persistent AR” — a core tenet of the AR cloud. Snapchat launched Local Lenses to do something similar, Apple has GeoAnchors, and Facebook acquired AR Cloud startup Scape to help build its AR LiveMaps. This signals lots of investment in location-based AR.

Back to where this could go next, we’ve speculated in the past that Google Lens could engender a new flavor of SEO focused on visual experiences. If a visual front end resonates with Gen Z and gradually becomes ubiquitous, it could compel businesses to optimize their listings for that visual experience.

Again, much of the data that populates these user experiences will simply flow from GMB, so there likely wouldn’t be much extra work required in this prospective branch of SEO. But certain types of visual media could shine in visual-search experiences and help local businesses stand out.

The more likely scenario is that a visual front end will simply reinforce the existing reasons to ensure local listings accuracy. In other words, inaccuracies will be that much more apparent in a visual interface. You don’t want to get it wrong when someone’s standing in front of your store pointing a phone at it.

Of course, the above is speculative. Outcomes will hinge on the wild card that is user traction. If the use case falls flat, this won’t be a channel that local businesses need to worry about. Though AR enthusiasts like me find visual search compelling, it remains to be seen whether people will use it en masse.

And that’s why Google continues to tread carefully with AR and visual search. It jumped into VR too quickly and later deprecated its Daydream platform. AR is much closer to search, but Google needs to feel out user behavior and optimize the UX first. Then, just as with search itself, monetization could follow.

Mike Boland has been a tech & media analyst for the past two decades, specifically covering mobile, local, and emerging technologies. He has written for Street Fight since 2011. More can be seen at Localogy.com