I’ve been on an ethics kick recently – this is the third column with that general theme – and I suppose it’s the inevitable result of being inundated each and every day with the raw details of our current political situation, presenting as it does dizzyingly frequent opportunities for public figures and private citizens to take ethical stands on subjects ranging from education to the environment to the very nature of our democracy.
But this column isn’t about that. It’s about ethics and pragmatism and Google and, eventually, some implications for local search. The question I’m contemplating comes down to this: if enough people believe something, should Google consider it to be true?
I’m inspired by and indebted to Danny Sullivan’s March 5 piece calling attention to what seems a worsening problem. We’ve all seen Google’s recent attempts to answer questions with a single, prominent response in the SERP. Sullivan calls this the “One True Answer” feature because he feels the single best answer to your question is what Google is aiming for. I’d say it’s the next evolution after the “I’m Feeling Lucky” button, which has been around from the beginning; in some sense, it’s always been Google’s calling to guide you to the one link that best suits your particular query.
Most of the time, this works pretty well. If I Google “how do you tune a mandolin,” for example, Google gives me what it thinks is the best answer on the internet at the moment, and I’m not dissatisfied with the result.
But as with all algorithms, things can go wrong. Sullivan raises the example of Google’s answer to the query “presidents in the ku klux klan.”
I don’t get the same result when I run the same query, but it’s similar enough to demonstrate that Google hasn’t yet come up with a way to fix the problem. In my case the answers are based on an article titled “11 People You Wouldn’t Believe Used to Be in the KKK,” from a site called AllDay.com. The article contains, among other things, the easily debunked assertion that President Warren G. Harding was “purportedly sworn into the KKK while in the White House.”
As I’m sure is the case with many of you, I’ve become well enough accustomed to the ways of clickbait to question the authenticity of a site called AllDay.com, even with nothing to go on but its name. The number of sites with vaguely journalistic names publishing vaguely journalistic articles like this one has exploded in the last couple of years, making most users wary. But in Google’s case, the ubiquity and popularity of such articles have evidently made it hard, from the algorithm’s perspective, to tell truth from fiction.
That’s where I start thinking about pragmatism – not the familiar word meaning a sensible, practical attitude but rather the term as it is used in philosophy. Pragmatism was the first school of philosophy founded in the United States, with members including William James, brother of the novelist Henry James. Its founder, Charles Sanders Peirce (1839-1914), is remembered as one of the thinkers who anticipated the rise of modern computing, but his main claim to fame was pragmatism, an approach to philosophy that tried to set aside unanswerable questions about the nature of truth and judge the value of a theory by the effect it has in everyday life. William James famously applied this tactic to religion in his book The Varieties of Religious Experience, where he suggested that whether or not certain religions were true in an absolute sense, their value could be determined by their ability to help people lead happier lives.
Peirce applied pragmatic philosophy to scientific knowledge, suggesting, for example, that the closest we can get to any absolute truth in science is simply the consensus of modern scientists. In other words, we can’t ever know the true nature of black holes or dark matter – or, to choose a topic of contemporary urgency, climate change – so our best method of determining what’s true about these phenomena is essentially polling the people whose job it is to seek those answers and trusting most firmly in the claims with which most of them agree.
Back to Google and the KKK then: though of course it’s very complex and nuanced and any short description is an oversimplification, the Google algorithm is essentially designed to find truth through consensus. Google’s 200 or so ranking factors add up to the unitary goal of assigning rank to sites based on a logically determined likelihood that they contain trustworthy information that is relevant to my query.
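To make that idea concrete, here is a deliberately simplistic sketch of truth-by-consensus ranking. It is nothing like Google’s actual 200-factor algorithm – the page names and link data are invented for illustration – but it shows how “consensus” can be mechanized as counting who vouches for whom, and therefore why sheer popularity can masquerade as trustworthiness:

```python
# Toy illustration only (NOT Google's algorithm): rank pages by
# "consensus", treating each inbound link as a vote of trust.
# All page names and links below are hypothetical.

from collections import defaultdict

def consensus_rank(links):
    """Given (source, target) link pairs, score each target page by
    how many distinct sources vouch for it, and return the pages
    ordered from most-vouched-for to least."""
    votes = defaultdict(set)
    for source, target in links:
        if source != target:  # a page vouching for itself doesn't count
            votes[target].add(source)
    return sorted(votes, key=lambda page: len(votes[page]), reverse=True)

links = [
    ("blogA", "encyclopedia"), ("blogB", "encyclopedia"),
    ("blogC", "encyclopedia"), ("blogA", "clickbait"),
    ("clickbait", "clickbait"),  # self-link, discarded above
]
print(consensus_rank(links))  # ['encyclopedia', 'clickbait']
```

The weakness the column is circling is visible even here: if enough blogs link to the clickbait page, it wins, because the counter has no notion of truth, only of agreement.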
In a world where questionable news is very popular, it’s not so surprising that Google’s logical assumptions might sometimes produce unexpected results. After all, trustworthiness at root is a matter of how many people are willing to trust you. In Peirce’s terms, it might seem as though the most pragmatic decision would be to let truthiness win in the SERP, since that’s what, in many cases, the public apparently demands. Of course, that conclusion would be wrong, but it isn’t necessarily easy to determine why or what to do about it.
The difference, to be clear, between Peirce and Google is that Peirce believed scientific truth was the responsibility of credentialed scientists. It was, in other words, a curated truth, with trust placed in the curators. Google wouldn’t be Google if it were curated; instead it represents something like the consensus of everyone. One could probably go further and claim that Google’s algorithm rests on the assumption that most people, most of the time, want to relay the facts, and that it should therefore be possible to discriminate clearly between facts as the norm and spam as the anomaly.
When the consensus itself is contested, when there is no agreement about who owns the facts, such assumptions can be undermined. But you might say that this is what we signed up for in building an unrestricted internet. Now it’s Google’s job to tighten the screws on what qualifies as the best answer to our question, just as Facebook and Twitter in their own ways are working to meet the challenge of maintaining a standard that doesn’t conflict with their basic neutrality.
I still haven’t said anything about local search, but the implications are there. Google is shutting down Map Maker on March 31, in part to close off a rampant source of spam, though the company will still solicit user edits through polling and via the “Suggest an edit” feature in Maps. This is one in an ongoing series of balancing measures in local search, where Google is constantly seeking the right mix of inputs so that Maps can remain the most trustworthy source of local information. Though the inputs are somewhat different, the challenge is more or less the same as the one faced by Google search as a whole.