Apple and Snap Signal Local AR Commerce Ambitions


Augmented reality, though sometimes overhyped, continues to hold ample potential as a new visual front end to our computing lives, with applications and use cases that range from gaming to product visualization. But one increasingly evident direction in which AR is headed is local commerce.

As we discussed recently with M7 Innovations, one of AR’s applications in local commerce is AR lenses, which contextualize products in retail and quick-service restaurant (QSR) settings. AR is also inherently conducive to local commerce, as its geo-anchoring capabilities provide a foundation for geo-relevant local search and discovery.

This connection between AR and location is a key principle behind the AR cloud: a persistent data layer that’s anchored to the physical world to activate AR graphics. For example, Google wants to build an Internet of places — revealed through Google Lens — by indexing the physical world just like it indexed the web.

Payoffs for this vision include monetization potential — through advertising, affiliate revenue, or other models — to facilitate local offline commerce. It’s often forgotten that brick-and-mortar commerce (at least in normal times) accounts for a commanding majority of consumer spending.


Visual SEO

All of this has been top of mind lately, and it was underscored further by announcements from Snap and Apple at their respective developer conferences.

Snap’s Local Lenses will let developers create geo-anchored, persistent content that Snap users can discover through the camera interface. This will also include the ability for users to leave persistent AR graphics for friends to discover. The use case Snap has promoted is more about fun and whimsy, such as “painting” the world with expressive digital graffiti. But the same capability could also surface local storefront information.

One potential outcome of projects like Local Lenses is a wild card that we’ve been calling visual SEO. Just as local businesses apply rigor to local listings management for web search, a sub-sector could develop around optimizing data to show up prominently and correctly in AR and visual search experiences.

Separate Snap developments could meanwhile tie in. For example, Snap Minis are new HTML5-based apps that will live in Snapchat’s Chat section and offer micro-functionality like casual games and utilities. The concept is similar to Apple’s subsequently announced App Clips (more on that in a bit).

Launch partners include Coachella (coordinate and plan a festival experience); Headspace (launch meditation sessions and share them with friends); and Movie Tickets by Atom (choose showtimes, watch trailers, buy tickets). Together, these partners demonstrate a wide range of use cases for lightweight, in-app functionality.

Minis could also be developed to discover, plan, and transact local activities such as dining out, following the model of WeChat’s mini programs in China. Tying it back to AR and Local Lenses, Minis could provide the transactional layer that flows from the geo-anchored AR experiences described above.

Halo Effect 

Apple, meanwhile, continues to show its AR aspirations. The latest is GeoAnchors for ARKit, announced at WWDC. These evoke AR’s location-based potential mentioned earlier by letting users plant and discover spatially anchored graphics that persist across sessions and users.

Similar to Google’s Live View AR navigation, this will tap into Apple’s Street View-like “Look Around” feature. GeoAnchors can use this data to localize a device before rendering the right spatially-anchored AR graphics in the right place. This will be Apple’s version of an AR cloud.
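
For a sense of what this looks like for developers, here is a minimal sketch of ARKit’s geo-tracking APIs (ARGeoTrackingConfiguration and ARGeoAnchor in ARKit 4 on iOS 14). The storefront coordinate and overlay content below are hypothetical placeholders, not part of any announced Apple or partner experience.

```swift
import ARKit
import CoreLocation
import UIKit

// A minimal sketch of ARKit geo tracking: check availability, run a
// geo-tracking session, and plant a geo-anchored point of interest.
// The coordinate and overlay are hypothetical placeholders.
final class GeoAnchorViewController: UIViewController, ARSessionDelegate {
    private let arView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)
        arView.session.delegate = self

        // Geo tracking only works on supported devices and in regions
        // Apple has mapped with street-level (Look Around-style) imagery.
        ARGeoTrackingConfiguration.checkAvailability { available, _ in
            guard available else { return }
            DispatchQueue.main.async {
                self.arView.session.run(ARGeoTrackingConfiguration())

                // A hypothetical storefront location to anchor content to.
                let storefront = CLLocationCoordinate2D(latitude: 37.3349,
                                                        longitude: -122.0090)
                self.arView.session.add(anchor: ARGeoAnchor(coordinate: storefront))
            }
        }
    }

    // Called once ARKit localizes the device and resolves the anchor;
    // this is where an informational overlay (hours, promos) would render.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARGeoAnchor {
            // Attach a node or graphic at anchor.transform here.
        }
    }
}
```

The availability check is the tell: geo tracking only resolves on newer devices and in cities Apple has mapped, which is exactly where the Look Around connection comes in.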

But the other key parallel is how, like Snap, these front-end AR experiences could be tied to the transactional infrastructure that Apple is separately building. As noted, Apple’s new App Clips will atomize functions previously housed in full-blown apps and make them accessible on the fly.

With this tech at hand, real-world activities like pre-paying a parking meter will no longer require downloading the associated app. Instead, users will scan a QR code to launch a lightweight slice of the app and feed that meter, just one of any number of use cases that developers and Apple’s retail partners may devise.
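
To illustrate, here is a hypothetical sketch of how such an App Clip could read the invocation URL passed along by a scanned code; the domain, “id” parameter, and parking UI are invented for illustration, not any real implementation.

```swift
import SwiftUI

// A hypothetical parking App Clip. When the user scans a code, the clip
// launches with an invocation URL delivered as an NSUserActivity.
@main
struct ParkingClipApp: App {
    var body: some Scene {
        WindowGroup {
            ParkingMeterView()
        }
    }
}

struct ParkingMeterView: View {
    @State private var meterID: String?

    var body: some View {
        // Placeholder UI: a real clip would show pricing and Apple Pay.
        Text(meterID.map { "Meter \($0)" } ?? "Scan a meter code to begin")
            .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                guard let url = activity.webpageURL,
                      let components = URLComponents(url: url,
                                                     resolvingAgainstBaseURL: true)
                else { return }
                // e.g. https://example.com/meter?id=1234 (hypothetical URL)
                meterID = components.queryItems?
                    .first(where: { $0.name == "id" })?.value
            }
    }
}
```

From there, the clip could present pricing and an Apple Pay sheet without the user ever installing the full app.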

This ties back to AR because separate reports on “Project Gobi” indicate that Apple will plant QR codes at retail partner locations (think Starbucks) that activate the smartphone camera to overlay informational AR graphics such as product information or promotions.

In that way, the same infrastructure of QR codes could serve both AR experiences and related transactional functionality in App Clips. This would put real utility behind AR, which the technology needs to pick up steam. Meanwhile, Apple’s halo effect and revenue incentives will drive brand and retailer adoption.

Of course, a lot of this is speculative in terms of Snap’s and Apple’s intentions. But the evidence aligns. And though the current state of the world isn’t very conducive to local offline commerce, Apple and Snap could be planting seeds for AR’s role in local commerce’s return.

Mike Boland has been a tech & media analyst for the past two decades, specifically covering mobile, local, and emerging technologies. He has written for Street Fight since 2011. More can be seen at Localogy.com