Triangulating Apple Maps: The Tech Angle


This post is the latest in our “Mapping the Future” series, an editorial focus for the month of July. You can see the rest of the series here.


Apple surprised the local search world last month when it announced local business reviews in Maps. As with its other search-based efforts, Apple formerly relied on partners like Yelp for local listings and reviews. But now, as part of its broader data-driven Maps overhaul, it will phase in original content.

Much has been written about this within the local search publishing world and analyst corps, including my colleague Stephanie Miles’ article on how brands can prepare for Apple Maps reviews here on Street Fight. So in the interest of treading new ground, what less-discussed clues in Apple’s recent mapping moves can help triangulate its direction?


Search What You See

The biggest Apple Maps development we’ve been tracking involves visual search. Sort of a cousin to augmented reality, this is about using ubiquitous and constantly improving smartphone cameras to scan one’s surroundings. Those scans can then prompt features like navigation or store info.

Several tech giants are working toward different versions of this vision. Snapchat is rolling out ways to leave location-anchored graphics for friends to discover. Known as Local Lenses, these could be a step toward more practical and commercial use cases like storefront reviews.

Going back further, Google has long signaled its intention to use the smartphone as a visual search tool for local discovery. Its Google Lens feature lets users point their phones at various objects to contextualize them. And like Snap’s efforts, this will have local-commerce endpoints.

A related effort can be seen in Google’s Live View. Like Google Lens, it scans surroundings, then overlays relevant graphical info. But instead of storefront info, it overlays walking directions. It does this by tapping Street View imagery for object recognition to “localize” your spatial position.
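To make that mechanism concrete, here is a hedged sketch of the general idea behind visual positioning: match features in the live camera frame against a georeferenced index of prior captures, then adopt the best match’s known pose as your refined position. Every type and helper below is a hypothetical stand-in for illustration, not a real Google or Apple API.

```swift
import CoreLocation

// Hypothetical sketch only: these types and helpers stand in for the
// Live View pipeline described above; none of them are real Google APIs.

struct VisualFeature {
    let descriptor: [Float]  // a compact signature of a corner/edge patch
}

struct ReferenceView {
    // One georeferenced capture, e.g. a Street View frame with a known pose.
    let coordinate: CLLocationCoordinate2D
    let heading: CLLocationDirection
    let features: [VisualFeature]
}

// Placeholder similarity score; real systems use robust descriptor matching
// and geometric verification rather than a simple count.
func matchScore(_ live: [VisualFeature], _ reference: [VisualFeature]) -> Double {
    Double(min(live.count, reference.count))
}

// "Localize" by finding the reference capture that best explains what the
// camera currently sees, then adopting its pose as the refined estimate.
func visuallyLocalize(liveFeatures: [VisualFeature],
                      index: [ReferenceView]) -> (CLLocationCoordinate2D, CLLocationDirection)? {
    let best = index.max { a, b in
        matchScore(liveFeatures, a.features) < matchScore(liveFeatures, b.features)
    }
    return best.map { ($0.coordinate, $0.heading) }
}
```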


Sensor Bundle

That brings us back to Apple, which is now doing something similar. In iOS 14, Apple Maps will let users get a more accurate location reading by scanning nearby buildings. This is useful in urban canyons, where GPS signals bounce off buildings, degrading the calculation of where you’re standing.

Apple is sidestepping that issue by letting the camera take over where GPS fails. This elevates the camera as a tool in the bundle of sensors (GPS, IMU, etc.) that have traditionally been used in the mobile local search era. And it’s one of several camera-based mobile local features we can expect.
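As a rough illustration of that fallback logic, here is a minimal Swift sketch using CoreLocation alongside ARKit 4’s geotracking API, the developer-facing sibling of the Maps feature: when a GPS fix reports poor horizontal accuracy, the camera-based localizer takes over. The 25-meter threshold and overall flow are assumptions for illustration, not Apple’s actual logic.

```swift
import ARKit
import CoreLocation

// A minimal sketch of "camera takes over where GPS fails," using ARKit 4's
// geotracking API. The 25 m accuracy cutoff and overall flow are illustrative
// assumptions; authorization requests are omitted for brevity.

final class HybridLocalizer: NSObject, CLLocationManagerDelegate {
    private let session = ARSession()
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // Multipath in urban canyons often degrades GPS to tens of meters.
        guard fix.horizontalAccuracy > 25 else { return }  // GPS is fine; keep it
        ARGeoTrackingConfiguration.checkAvailability { available, _ in
            guard available else { return }  // requires Apple's imagery coverage
            DispatchQueue.main.async {
                // Visual localization: ARKit matches camera frames against
                // Apple's localization imagery to refine position and heading.
                self.session.run(ARGeoTrackingConfiguration())
            }
        }
    }
}
```

In practice, the geotracking session only succeeds where Apple has collected localization imagery, which is why the availability check comes first.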

This move could have been predicted given Apple’s efforts to reboot its underlying maps data — the underpinnings of any computer vision-based system. This is primarily to upgrade and modernize a core iOS function and to alleviate the black eye still felt from last decade’s Mapgate.

Look Around is one component of that mapping reboot. It’s a Street View-like feature that goes a step further with 3D depth data, compared to Street View’s patchwork of 2D images. This is the base data that will power the new positioning feature introduced above, among other things.


Follow the Money

When predicting the above moves, or any tech-giant moves, it’s useful to follow the money, because those moves often trace back to the drive to protect core revenue streams or pave the way for new ones. For example, Google’s visual search work is meant to future-proof its massive core search business.

In other words, Google’s DNA is all about indexing things. It now wants to index the physical world just as it indexed the web: a sort of “internet of places.” Payoffs will include the potential to monetize local offline commerce, an extension of the path it’s been on for years.

Back to Apple: its Maps reboot is about differentiating iOS and selling more iThings. The immediate benefit of the above mapping features will be a more attractive core mapping app in the iOS default tray. Longer term, it sets up AR-fueled local search and revenue diversification in the face of falling iPhone sales.

Through this lens, Apple’s latest visual-localization feature could be one of many computer vision-based features tied to its broader AR efforts, where it has a lot riding. Apple hopes much of the above will acclimate the world to visual interfaces so that its AR glasses hit the ground running.

Add it all up, and there are a few puzzle pieces for local’s next visual era. It could be more “heads up” than the downward-held UX we’ve experienced for the past decade (Facebook thinks so, too). This will be a gradual transition — as all behavioral shifts are. But a few signs point in that direction.

Mike Boland has been a tech & media analyst for the past two decades, specifically covering mobile, local, and emerging technologies. He has written for Street Fight since 2011. More can be seen at Localogy.com