Machine learning is slowly but surely becoming a ubiquitous presence in digital technology, one that is likely already having an impact in local search.
To understand its importance in consumer technologies today, consider the case of Snapchat Lenses. Lenses are those wacky overlays in Snapchat that make you and your friends look like puppy dogs or fairies wearing flower wreaths. They work by means of computer vision technology created by a Ukrainian company, Looksery, which Snapchat acquired for $150 million in 2015.
Computer vision married with machine learning is the technology that helps Facebook and Google organize your photos according to who is in them, helps self-driving cars figure out what’s on the road, and lets you deposit a check into your bank account by taking a picture of it on your phone.
For Snapchat, computer vision gives the app the ability to recognize the outline of your face by analyzing patterns of contrasting pixels. But in order to identify your specific facial features, Snapchat makes use of machine learning. The company employs a statistical model of a face shape that has been trained by people marking the borders of facial features on thousands of sample images.
This data feeds into an algorithm that understands what an average face looks like. When you show the app your face, it maps the average face onto yours and then “learns” how your features differ, customizing the model where necessary in order to create a unique grid overlay that can be decorated or distorted by Snapchat Lenses.
(A terrific video from Vox explains all of this in more detail.)
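The “average face adjusted to your face” idea can be sketched in a few lines. This is a deliberately toy illustration of the concept, not Snapchat’s proprietary pipeline: real systems use statistical shape models with dozens of landmarks and iterative refinement, while here the landmark counts, coordinates, and function names are all invented for the example.

```python
# Toy sketch of the "average face" idea behind face mapping.
# The real technology (statistical shape models fit to image data)
# is far more complex; everything here is illustrative only.

def mean_shape(training_shapes):
    """Average landmark positions over many hand-labeled faces."""
    n = len(training_shapes)
    k = len(training_shapes[0])
    return [(sum(s[i][0] for s in training_shapes) / n,
             sum(s[i][1] for s in training_shapes) / n)
            for i in range(k)]

def fit_to_face(avg, box):
    """Scale and translate the average shape into a detected face box.
    box = (x, y, width, height); landmarks are in unit coordinates."""
    x, y, w, h = box
    return [(x + lx * w, y + ly * h) for lx, ly in avg]

# Three "training" faces, each with two landmarks (left eye, right eye)
# in unit coordinates -- a real model would use dozens of landmarks.
training = [
    [(0.30, 0.40), (0.70, 0.40)],
    [(0.32, 0.42), (0.68, 0.42)],
    [(0.28, 0.38), (0.72, 0.38)],
]
avg = mean_shape(training)                     # the learned "average face"
landmarks = fit_to_face(avg, (100, 50, 200, 200))
print(landmarks)                               # eyes placed inside the box
```

The “learning” lives in `mean_shape`: the model is nothing more than what thousands of labeled examples have in common, which the app then deforms to match the face in front of the camera.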
According to Arthur Samuel’s classic 1959 definition, machine learning “gives computers the ability to learn without being explicitly programmed.” Machine learning can also be described as “an algorithm or model that learns patterns in data and then predicts similar patterns in new data.” In the Snapchat example, an already sophisticated facial mapping algorithm becomes truly innovative because it is capable of adjusting itself to trace the specific contours of faces it has never encountered.
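That second definition, learning patterns in data and predicting similar patterns in new data, fits even the simplest models. As a minimal sketch (with made-up data and labels), a nearest-centroid classifier “learns” by averaging its training examples and “predicts” by matching new points to the nearest average:

```python
# Minimal illustration of "learn patterns in data, predict similar
# patterns in new data": a nearest-centroid classifier in pure Python.
# The points and labels are invented for the example.

def train(samples):
    """Learn one centroid (average point) per label."""
    sums, counts = {}, {}
    for (x, y), label in samples:
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {label: (sx / counts[label], sy / counts[label])
            for label, (sx, sy) in sums.items()}

def predict(centroids, point):
    """Predict the label whose learned centroid is closest."""
    px, py = point
    return min(centroids,
               key=lambda l: (centroids[l][0] - px) ** 2 +
                             (centroids[l][1] - py) ** 2)

data = [((1, 1), "a"), ((2, 1), "a"), ((8, 9), "b"), ((9, 8), "b")]
model = train(data)
print(predict(model, (1.5, 2)))   # near the "a" cluster -> "a"
print(predict(model, (8, 8)))     # near the "b" cluster -> "b"
```

Nothing in `predict` was explicitly programmed to know about “a” or “b”; the behavior comes entirely from the training data, which is Samuel’s point in miniature.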
To understand the implications for local search, we can look, as I’ve done before, to the innovations coming from Uber. The new update just announced — a significant overhaul to the user experience of the Uber app — is centered on machine learning. Uber wants to understand and learn from user behavior in order to streamline the process of ordering rides. They intend to do this by finding patterns in the rides you’ve taken in the past, so that the app can get you quickly to your favorite destinations. And while you’re on the way home, Uber will offer you the option of ordering food for dinner with Uber Eats. Or if you’re going out for the evening, you can send a Snapchat message from the Uber app to let your friends know you’re on the way.
Whether by tracking your ride history or offering third-party integrations that connect Uber rides to your other activities, the company’s strategy is to anticipate the future needs of users by paying close attention to their past behaviors — to learn patterns in data and predict similar patterns.
Didier Hilhorst, Uber’s design director, helpfully explains that first among the company’s goals in the redesign was saving time for users. “If you think of a lot of apps,” Hilhorst says, “they want your time. Think of Facebook, Twitter, or Netflix; they take the time and sell the time. Uber is in a different business – we want to give time back. We want to be respectful of your time and get you on your way as fast as possible.”
Enter the local search use case. As with ride sharing, the purpose of local search apps like Google Maps and Apple Maps is not to draw you in to an immersive social experience, but rather to get you on your way to your preferred destination as quickly as possible.
The challenge is in the comparatively messy dataset that local search applications must sift through. Rather than simply getting you from one address to another, local search apps must show you the best choices among businesses of myriad categories and descriptions, and must provide enough differentiating information to allow users to make an informed decision. To an extent, the more useful data a local search app contains, the less easily it can fulfill the basic need of providing the best answers quickly so users can be on their way.
This strikes me as a prime target for machine learning. Though Apple and Google are already silently saving your favorite destinations and offering recommendations in popular categories based on location and time of day, huge potential remains to make use of data gathered through user activity to predict future needs. Think of Netflix movie recommendations or Pandora stations, where your previous ratings and consumption habits contribute to a custom set of recommendations for what the algorithm predicts you will like.
For local search, the equivalent might be predicting which new restaurant you’ll like based on its similarity to past restaurants you’ve visited or rated favorably — or in a more sophisticated vein, predicting what you’ll like based on your similarity to other users. Usage patterns might also drive recommendations; if you live in the San Francisco Bay Area and love Philz Coffee, you might appreciate being told that a Philz location is nearby in a town you don’t usually visit.
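The “similarity to other users” approach can be sketched as simple user-based collaborative filtering. This is a hypothetical illustration, not how Google or Apple actually rank places: the restaurant names (apart from Philz Coffee, mentioned above), the ratings, and the helper functions are all invented for the example.

```python
# Hypothetical sketch of "predicting what you'll like based on your
# similarity to other users" (user-based collaborative filtering).
# Restaurant names and ratings are invented for the example.

from math import sqrt

def cosine(a, b):
    """Cosine similarity over the places two users both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[r] * b[r] for r in shared)
    na = sqrt(sum(a[r] ** 2 for r in shared))
    nb = sqrt(sum(b[r] ** 2 for r in shared))
    return dot / (na * nb)

def recommend(ratings, user):
    """Suggest the top-rated place of the most similar other user
    that `user` hasn't already rated."""
    others = [u for u in ratings if u != user]
    nearest = max(others, key=lambda u: cosine(ratings[user], ratings[u]))
    candidates = {r: s for r, s in ratings[nearest].items()
                  if r not in ratings[user]}
    return max(candidates, key=candidates.get) if candidates else None

ratings = {
    "you":   {"Philz Coffee": 5, "Taqueria X": 4},
    "alice": {"Philz Coffee": 5, "Taqueria X": 4, "Noodle Bar": 5},
    "bob":   {"Philz Coffee": 1, "Taqueria X": 5, "Burger Shack": 5},
}
print(recommend(ratings, "you"))   # alice's tastes match yours -> "Noodle Bar"
```

A production system would add rating normalization, many more signals (location, time of day, category), and cold-start handling, but the core prediction step is this small.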
To work as smoothly as Snapchat’s Lenses, machine learning and predictive analytics need to meld seamlessly with core app functionality. The technology needs to “just work,” without steep learning curves or frustrating dead ends. For these reasons, I’d expect any company that experiments with machine learning for local search to start with a simple set of problems and hone the user experience in order to showcase the value of the technology.
Who knows? A Netflix of local businesses could be on its way to a phone near you.