Exploits and Vulnerabilities Challenge the Integrity of Google Maps


It’s been an odd few weeks for Google Maps. Back in April, the search giant was embarrassed when stunt edits from anonymous users began making headlines. In one of these, a user got a bogus snowboarding shop called “Edwards Snow Den” listed at 1600 Pennsylvania Avenue, playfully suggesting that NSA whistleblower Edward Snowden was hiding out in the White House. In rather poorer taste, another user hacked Google Maps to create a “park” in India in the shape of Google’s Android peeing on the Apple logo, along with other “parks” shaped like the Skype logo and a happy face. Publicity around the incidents prompted Google to crack down before Facebook and Twitter logos submitted by the same user, nitricboy, could be published on Maps.

The Snowden prank exploited a weakness in the Google Maps verification process that allows a user, once a business listing is claimed, to change key information such as the business address. As for the off-color Android exploit, nitricboy made use of the same channel through which thousands of crowdsourced editorial contributions have made their way onto Google Maps. I’m speaking of Map Maker, Google’s public forum whereby users around the world improve the accuracy of Maps by adding points of interest and correcting errors. Users with a history of valuable (or uncontested) contributions to Map Maker are given a fast track to publication, a vulnerability nitricboy apparently turned against Google. In fact, one of nitricboy’s park pranks included the message “Google review policy is crap.”
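Google hasn’t published how Map Maker’s fast track actually worked, but the general shape of such a trust policy is easy to sketch. The Python below is a hypothetical illustration only: the Contributor type, thresholds, and approval rule are assumptions, not Google’s system. It shows why a history-based fast track is attractive at scale, and exactly where it breaks.

```python
# Hypothetical sketch of a history-based auto-approval policy of the kind
# Map Maker reportedly used. The thresholds and field names are assumptions
# for illustration; Google's actual rules are not public.
from dataclasses import dataclass

@dataclass
class Contributor:
    accepted_edits: int  # past edits approved or never contested
    rejected_edits: int  # past edits flagged or reverted

def auto_approve(user: Contributor, min_history: int = 50,
                 min_accept_rate: float = 0.95) -> bool:
    """Fast-track an edit when the contributor's track record looks clean.

    The weakness: once an account crosses these thresholds, every later
    edit skips review, so a patient abuser can earn trust with benign
    contributions and then publish pranks unchecked.
    """
    total = user.accepted_edits + user.rejected_edits
    if total < min_history:
        return False  # new accounts always go to manual review
    return user.accepted_edits / total >= min_accept_rate

# An account with a long, clean history sails straight to publication:
print(auto_approve(Contributor(accepted_edits=200, rejected_edits=3)))  # True
```

In terms of this sketch, suspending auto-approval, as Google did on May 11, amounts to making auto_approve return False for everyone: safe, but it dumps the entire worldwide edit volume onto manual reviewers.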

Google quickly removed the Snowden prank and the Map Maker exploits from Maps, but the company ultimately decided a more dramatic step was necessary. On May 11, Google’s Pavithra Kanakarajan announced that auto-approval and user moderation in Map Maker would be suspended until further notice due to “escalated attacks to spam Google Maps over the past few months.” All public edits to Maps as of May 12 became subject to manual review by Google staff, which, given the volume of edits worldwide, effectively meant Google was closing the door on Map Maker contributors, even if only temporarily.

Fast forward to May 19, when the Washington Post broke the story that certain phrases, including the N-word, entered in the Maps search box were returning the White House as a search result. Though the Post initially conflated this incident with the Android and Snowden pranks, it turned out to be unrelated. In a detailed May 20 analysis on Search Engine Land, Danny Sullivan observed that numerous racist and offensive terms were returning Maps listings as results, including unusual cases such as a San Diego record store specializing in rap that also turned up when the N-word was entered as a search term.

What was going on? Google confirmed Sullivan’s suspicion just a day later. Recent algorithm updates, in particular the Pigeon update, were causing Google to associate online discussions of a place with its Maps listing. The well-intentioned move was designed to improve Maps by building networks of relevant key phrases linked to a business. When websites, online forums, or news stories mention a business in connection with key terms such as certain products, services, or brands, Google may be more likely to show you that business in Maps when you search for the term. For example, if a lot of people are talking about the garlic fries at a downtown restaurant, you might see that business in Maps when you search for “garlic fries,” even if the Maps listing contains no mention of that term.
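To make the mechanism concrete, here is a minimal Python sketch of the association idea, assuming simple co-occurrence counting as the signal. Google’s actual signals and scoring are proprietary; build_phrase_index and extract_phrases are illustrative names, not real APIs, and the restaurant is the article’s own garlic-fries example.

```python
# Minimal sketch of the association idea described above: count how often a
# phrase co-occurs with a business name across web documents, then use that
# count to surface the listing for the phrase, even if the listing itself
# never mentions it. This naive co-occurrence scoring is an illustrative
# assumption, not Google's pipeline.
from collections import defaultdict

def extract_phrases(text, n=2):
    """Naive n-gram extraction, standing in for real key-phrase mining."""
    words = text.split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def build_phrase_index(documents, businesses):
    """Map each phrase to the businesses it co-occurs with, by count."""
    index = defaultdict(lambda: defaultdict(int))
    for doc in documents:
        text = doc.lower()
        for biz in businesses:
            if biz.lower() in text:
                for phrase in extract_phrases(text):
                    index[phrase][biz] += 1
    return index

docs = [
    "the garlic fries at Downtown Grill are unbeatable",
    "Downtown Grill serves the best garlic fries in town",
]
index = build_phrase_index(docs, ["Downtown Grill"])
print(dict(index["garlic fries"]))  # {'Downtown Grill': 2}
```

Note that nothing in such an index distinguishes a rave about garlic fries from a racist screed: any phrase that co-occurs with a place often enough gets associated with it, which is precisely the failure mode the White House incident exposed.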

Sounds like a smart idea, until you are confronted with the implications. If Google treats any online content as equally relevant, offensive discussions will inevitably enter the mix. But as Andrew Shotland has cleverly pointed out, the White House example may have more innocent sources as well, such as a historically relevant Wikipedia page that happens to link the White House with the N-word.

To address the latest problem, Google has adjusted its algorithms yet again. Shortly after the White House incident was made public, Google apologized and pledged to fix similar problems across the board. At about the same time, as evidenced by the recent buzz in the local SEO community, local rankings were upended, with Google location settings failing to function or delivering mismatched results between the local pack and organic search. It’s unclear at this point whether the latest upset is actually related to the White House problem. If it is, the evidence suggests a widespread and perhaps overeager move on Google’s part to avoid further embarrassment at all costs.

The recent run of bad luck for Google doesn’t lend itself to a single, comprehensive explanation. There’s no obvious connection between loopholes in listing verification, a too-lax approval process for Map Maker, and the unforeseen consequences of an algorithm change. At the same time, each of these issues in its own way highlights a key problem for Google: how to build and maintain an accurate local dataset on a global scale. Automation and crowdsourcing are critical to scalability, but unlike OpenStreetMap or Wikipedia, Google Maps is not a commons. The company benefits from ownership of the data, but must also carry the blame for its failures.

Damian Rollison is Director of Market Insights at SOCi, the leading CoMarketing Cloud for multi-location enterprises. SOCi empowers nearly 1,000 brands to automate and scale their marketing efforts across all locations and digital channels.