Google Elevates Mapping and Visual Search

Google held its annual Search On event this week, announcing several updates to core search and orbiting products like Maps. Given mapping’s centrality to local commerce, we’ll start there.

In addition to mapping (and within it), there was an emphasis on all things visual. This includes more visual ways to search, such as using images to find craveable food items nearby. It builds on tools like Google Lens and “Multisearch Near Me,” and carries lots of local search implications.

With that backdrop, here are some of the notable highlights from Search On.

Search with Live View

Live View is Google’s 3D/AR mapping feature for urban walking navigation. It uses Google’s Street View image database to localize a device (recognizing where you’re standing) and overlays AR directional arrows on your route.

This week’s update adds a search component. You can now search for businesses, in addition to navigating to a specific address. By holding up your phone to a given streetscape, you can search for a business (say, ATMs) to see them revealed visually. And naturally, you can then navigate to the closest one.

“You can just lift up your camera and see overlaid on the real world the ATM that’s nearby,” said Google VP and GM of Geo Chris Phillips at the event. “You can also see coffee shops, grocery stores, and transit stations. You really get a sense of what an area is like at a glance.”

In addition to indicating the whereabouts of these businesses, Google will reveal details like hours of operation and other vitals. Google is uniquely positioned to do all of the above, given the data it has collected over years of local search. And in local visual search, it has only scratched the surface.

Multisearch Near Me

Speaking of ongoing developments in visual search, Multisearch Near Me fits that description. For those unfamiliar, it lets users combine various search inputs to find the things they’re looking for. Start a search for a new jacket using an image, then refine the search with text (e.g. “the same jacket in blue”).

After previewing it at Google I/O, Google announced that the feature is now live in the U.S. Moreover, it has baked in more local discovery use cases. In addition to the fashion use case noted above (and finding local retailers), you can now search for local restaurants with food images.

For example, say you see a tasty dish on Instagram. You can now use that image to identify the dish with Google Lens and then use Multisearch Near Me to find local restaurants that serve the same or similar fare. This flips the script with local search, atomizing it to desirable items, rather than business listings.

“This new way of searching is really about helping you connect with local businesses,” Google VP and GM of Search Cathy Edwards said at the event, “whether you’re looking to support your local neighborhood shop or you just need something right away and can’t wait for the shipping.”

Neighborhood Vibe

Google Maps now also lets users get a sense of the vibe of a given neighborhood. This happens through a new SERP interface that highlights things like photos and discussions from the Google Maps community. The idea is to surface the scuttlebutt on trendy places and inside info on a given hood.

Google is hoping that the characteristics that emerge are things like artsy vibes or emerging foodie cultures. This aligns search with some of the attributes that people may be looking for. Like the above, it also aligns with the trend toward atomized attributes that define search results (paging Christian Ward).

These search evolutions help Google future-proof its business. It sees the world moving towards more AI-driven discovery. Google also knows that users don’t naturally think in the way that search operates (keywords and link-based SERPs). So, all these moves drive towards more natural language search.

Immersive View

Lastly, Google’s Immersive View is less about AI and algorithmic refinements and more about sheer sex appeal. Also previewed at this summer’s Google I/O, these immersive maps feature stylized bird’s-eye views of various locales. The use case is more confined to travel planning and wanderlust.

This move also follows the ongoing mapping one-upmanship, as Apple continues to raise its game in Apple Maps. Its last few updates have included similar immersive 45-degree bird’s-eye views. Google has fired the latest shot with heavily animated and stylized renditions of popular locales.

But this move isn’t just about fancy graphics. It is Google, after all, so it wants to differentiate by tapping into the rich data reserves referenced above. In this case, it will surface dynamic metadata for a given location — including weather, traffic, and crowds — as it currently does in Google local search results.

Immersive View already includes views of 250 global landmarks, and it will launch for users in the coming months in Los Angeles, New York, San Francisco, London, and Tokyo. More cities will be added subsequently as Google pursues its global coverage ambitions.

Mike Boland has been a tech & media analyst for the past two decades, specifically covering mobile, local, and emerging technologies. He has written for Street Fight since 2011. More can be seen at