Facebook, Holocaust Denial, and the Refusal of Politics


Facebook announced this morning that it would shift its stance on Holocaust denial, opting to remove content that denies the Holocaust occurred or distorts the facts surrounding it.

This comes two years after Facebook co-founder and CEO Mark Zuckerberg ignited a firestorm of controversy by using Holocaust denial as a paradigmatic example of the kind of content he personally finds offensive but would not remove from the world’s most popular social network. “I don’t believe that our platform should take [Holocaust denial content] down because I think there are things that different people get wrong,” Zuckerberg told Kara Swisher at the time.

Of course, Zuckerberg’s stated reason for leaving up Holocaust denial content — that “there are things that different people get wrong” — is faulty, whataboutism-style reasoning on its face. That a certain kind of false, hateful content should not be struck down because other people are posting other kinds of false, hateful content is not an ethical or logical reason to leave the first sort of hateful content up. As children understand, two wrongs do not make a right.

More generously, one might assume that Zuckerberg meant that Facebook should not remove the false, hateful posts of Nazi sympathizers because some Facebook content may fall less clearly into the binaries of fact and fiction, malignant and benign. Striking down clearly false content would put Facebook in the position of having to decide whether less clearly false and hateful content also had to be taken down. Facebook did not want to get involved in these difficult questions of interpretation. Its defenders might even say Facebook does not have the resources to do so, though it did have the resources to easily swallow a $5 billion fine from the Federal Trade Commission for privacy violations.

Facebook’s implicit solution to this problem of interpretation presented by calls for more stringent content moderation was to interpret very little, striking down some forms of hate speech while leaving up even clearly false and dangerous content such as Holocaust denial. This ostensibly preserved the widest possible array of users’ ‘freedom’ to say whatever they liked. It ensured Facebook could not be accused of bias against … Nazis, or against more mainstream groups and political figures that sometimes promote or participate in hate speech. Facebook’s laissez-faire position also allowed it to continue to claim to be a platform, the very language of which suggests that Facebook merely provides a forum for speech but does not regulate it.

The well-known problem with this would-be laissez-faire, apolitical, or neutral approach to content moderation is that Facebook has long moderated its content, not simply by striking down particularly dangerous or false posts but by algorithmically sorting the kind of content that gets amplified, the groups to which users are invited, and the users they are invited to add as friends. All of this tips the scales and undermines Facebook’s claims to neutrality or mere platform status. Simply put, Facebook has always been an editor and orchestrator of attention, and even when it was refusing to take down posts denying the Holocaust, it was making an active and decidedly editorial decision. The company has never been neutral or apolitical; it has always been more publisher than platform.

In the case of the Holocaust, the attention Facebook orchestrates appears to have contributed to growing ignorance, especially among kids and young adults. Eleven percent of respondents to a recent survey about the Holocaust said they believed Jews caused their own genocide. Seven percent said they were not sure it had happened, and three percent denied it had happened. About half of millennial or Gen-Z respondents reported seeing Holocaust denial content online. Until days ago, Facebook’s stated position was that Holocaust denial did not constitute hate speech and therefore did not need to be taken down.

Here’s the thing: Facebook’s delay regarding Holocaust denial is not a good-faith ethical calculation. It does not emerge from a careful weighing of the value of free speech vis-à-vis the value of contesting anti-Semitism and preserving a common understanding of history. In this instance and in general, ethics and politics are secondary for Facebook. Primary are preserving attention, maximizing profit, and warding off regulation from Washington, where Republican lawmakers claim, with little evidence, that censorship of conservative ideas warrants greater scrutiny of Facebook and its peers. In fact, it may well have been the threat of regulation that finally forced Facebook to ban Holocaust denial. Facebook’s potential role in Holocaust erasure would be a bad look for a company facing imminent hearings on its alleged monopolization of social networking. Legislators may question whether the Instagram and WhatsApp parent has become too large to manage itself. With Democrats poised to take the White House and Senate and a Big Tech trust bust on the table, taking a tougher stance against hate speech may have seemed the shrewd financial move.

That said, Facebook’s long-term refusal to strike down Holocaust-denial content is not a problem specific to Facebook. It’s not a decision limited to Zuckerberg or a few feckless executives. The problem is not even limited to tech. Facebook’s purported refusal of politics — its reluctance to accept that it has always been a political actor and that its content-moderation policies and algorithms have real-world effects on what people believe and what they do, up to and including acts of physical violence as in Myanmar — is a structural feature of shareholder capitalism. A content ecosystem whose leaders are so timid as to let Holocaust denial flourish is the logical result of an approach to management that views its only responsibility as minimizing costs and maximizing market capitalization.

In other words, the bottom line for Facebook is that accepting responsibility for sociopolitical outcomes as part of the cost of doing business would be, well, costly. Facebook does not want to hire legions of ethicists or historians with subtle understandings of what makes something false or offensive to clean up its ecosystem. It does not want to invest in building an ethical and politically sophisticated approach to content moderation with the same gusto it has shown when investing in engineering. Facebook wants to do the bare minimum on content moderation, paying contractors meager wages to do the dirty work while waging a “free speech” public relations war over more ambiguous cases.

The upshot of Facebook’s shareholder capitalist refusal of politics is that toppling Zuckerberg, whose ‘unfireable’ command of Facebook is oft bemoaned, would not solve the social network’s shortcomings or transform it into a beneficent steward of public discourse. Facebook under Zuckerberg infamously moved fast, growing as big as possible and breaking the information ecosystem on which democracy depends in the process. But Zuckerberg is not the only major corporate executive, Facebook hardly the only corporation, and tech not the only industry, to refuse politics for the sake of enriching a disproportionately wealthy class of shareholders. Facebook’s problem is a systemic economic and cultural one. Making Facebook more benevolent would require reorienting the company’s fundamental goals so that slimmer profits could be anticipated, and even structurally validated, in exchange for society’s betterment. Corporate America’s norms do not encourage that.

Last year, 181 members of the Business Roundtable pledged to rein in the excesses of shareholder capitalism, integrating concern for workers, the environment, and society more broadly into their managerial strategies. The corporate leaders pledged to “promote an economy that serves all Americans.” A recent study financed by the Ford Foundation and conducted by KKS Advisors found that the stated shift in values “has failed to deliver fundamental shifts in corporate purpose in a moment of grave crisis when enlightened purpose should be paramount.” Covid-19 put unprecedented economic pressure on US businesses. It does not seem to have inspired the care for workers and social welfare to which the Business Roundtable leaders claimed to aspire.

Zuckerberg or no Zuckerberg, a future in which Facebook orients its business around better sociopolitical outcomes is unlikely to arrive.

Joe Zappa is the Managing Editor of Street Fight. He has spearheaded the newsroom's editorial operations since 2018. Joe is an ad/martech veteran who has covered the space since 2015. You can contact him at [email protected]