Facebook, Free Speech, and the Responsibility of Power
Facebook is both a media company and a tech company. At a time when we speak freely of "digital media" without questioning what the phrase means, we should not expect a border between the two kinds of enterprise. In today's digital media landscape, tech and media go hand in hand: technical properties and infrastructure provide the power to mediate what we see, read, and believe. Facebook, like Twitter, Snapchat, Netflix, and Amazon, is both a tech company and a media company because it controls a vast technological apparatus that mediates the way its more than two billion users experience reality.
As one of the world's largest media companies, perhaps the single largest and most powerful, Facebook has a clear responsibility to monitor the content on its site: the stories, comments, photos, and overall speech it mediates, all of which shape its users' perception of their world. But even if one were to argue that Facebook is not a media company, wouldn't everyone have to accept that Facebook mediates our reality, and massively so? And if we agree on that point, I'd ask that we all ponder what kind of responsibility comes with Facebook's tremendous power to mediate. What great responsibility comes with such great power?
I want to suggest briefly here that the many arguments adduced to spare Facebook the responsibility of monitoring its content, of removing everything from content that incites physical violence to false political advertising, fail because they rest on underdeveloped and unimaginative understandings of responsibility itself. To argue that Facebook should be spared almost all regulatory expectations because it is a technology like the telephone rather than a media outlet like the New York Times, or that Facebook should not be entrusted with taking down false advertising or violent speech because those tasks are best left to government, is a failure of imagination, a failure to imagine what (civic) responsibility entails. As the word suggests (respons-ibility), any company or person who provides the possibility of speech, who can take it away from any given user, and who makes billions in profits off it, has a duty to answer for the admittedly unpredictable and deeply complex ramifications of the speech spoken under its auspices. Facebook's scale makes that a daunting task, but it is that very scale, Facebook's vast power and resources, that makes it incumbent upon the company to work harder than any other media organization to ameliorate its ecosystem.
Mark Zuckerberg has said that no one deserves as much money as he has earned off Facebook; in response, he is giving much of it away. But more urgent for the society his company sculpts, or mediates, daily is not Zuckerberg's ever-growing fortune but his ever-growing power and the power of his company. It is a power that comes with the responsibility to think big about responsibility: not to shy away from regulating Facebook because the government should take care of its problems, or because it may not be a media company, or because we live in a land of free speech. Free speech does not mean companies bear little to no responsibility for monitoring the language, ads, and images they host.
If Silicon Valley were the hotbed of innovation its champions claim it to be, if its innovation extended beyond managers' profits and engaging products to the impact of those products on society and the responsibility of their stewards, we would not be having a conversation every month about whether Facebook should take down false and inflammatory content. We would instead be asking how Facebook could be even more responsible, with no predetermined limit on the methods it might innovate to reach that goal.