I can only imagine what the emails flying back and forth at YouTube headquarters look like today. Let’s recap the company’s last 36 hours: On Tuesday night, YouTube announced that it would not discipline Steven Crowder, a conservative comedian-vlogger-personality who had, seemingly in contravention of YouTube policy, been insistently targeting Vox journalist Carlos Maza. Crowder had called Maza, among other things, a “gay Mexican,” a “lispy queer,” and an “anchor baby,” while his followers had harassed Maza on his personal phone. YouTube seemed to suggest that Crowder was in the clear because his “criticism is focused primarily on debating the opinions expressed.”
The response from journalists and tech critics to YouTube’s ruling was not … enthusiastic. On Wednesday afternoon — still communicating largely via its Twitter account — the company seemed to change its mind, announcing that it would demonetize Crowder (that is, suspend ads on his channel so that he couldn’t make money directly from videos). Crowder’s “deeply offensive” opinions may not have violated company policy, but, the company’s review had determined, his “pattern of egregious actions has harmed the broader community,” which meant YouTube would no longer pay him.
Unless, that is, he stopped selling T-shirts? An hour after demonetizing Crowder, YouTube clarified that the vlogger could remonetize if he removed links to merch he was selling — specifically, a T-shirt that reads “Socialism is for f-gs.” Shortly after, it reclarified: “This channel is demonetized due to continued egregious actions that have harmed the broader community. To be reinstated, he will need to address all of the issues with his channel.”
So, to sum up: YouTube has ruled that Steven Crowder is primarily interested in debate, but that he has also harmed the broader community of YouTube with egregious actions based on his deeply offensive opinions, which are, by the way, not violations of company policy. He will remain on the website, but will not be able to make money directly from it, unless he removes a link and addresses all of the issues with his channel, whatever they may be.
Also, they’re going to take a look at all of these rules, and might change them? To cap off the saga, on Wednesday night YouTube published a new blog post taking “a harder look at harassment,” in which the company pledged to take “a hard look at our harassment policies with an aim to update them.”
Was this a positive sign? Can YouTube craft a harassment policy that would prevent “egregious actions” without punishing those “primarily interested in debate”? Platforms like YouTube love to craft and recraft their rules, because clear, detailed rules seem like they might bring some semblance of order and predictability to the seething and volatile social structures they oversee. This is both a matter of principle — as a business, YouTube relies on the daily effort of hundreds of thousands of vloggers, who need some assurance that they will be dealt with predictably and that their livelihood will not be interrupted unfairly — and a practical matter: it’s much cheaper to hand a rule book over to inexpensive contractors than it is to train up thousands of people in the complicated political-cultural business of principled content moderation.
The problem for YouTube is that for rules to be taken seriously by the people they govern, they need to be applied consistently and clearly. And YouTube has done a very bad job of that. Crowder’s behavior was okay, until 12 hours later it wasn’t; Alex Jones was okay, until he wasn’t; clips of feminists being beaten up in a video game were okay, then they weren’t, then they were again. The company has a habit of tacking between empty, tedious literalism and broad, sweeping judgment in its moderation decisions. The legal formalism of a ruling holding that, because Crowder did not explicitly call for Maza’s harassment, he can’t be held accountable seems very distant from whatever school of interpretation allows the company to pronounce that his “egregious actions” have “harmed the broader community.”
This kind of inconsistency and lack of clarity makes it very difficult to treat YouTube’s intricate scaffold of policy — and its endless promises to “reexamine” that policy — as anything other than a cover for YouTube’s actual rule, which is: “Don’t do anything that attracts negative outside attention.” What seemed to ultimately get Crowder rebuked wasn’t that his case was kicked up to some higher court of appeals empowered to overturn precedent, but that his behavior (and YouTube’s initial ruling) had generated an enormous amount of anger — and negative press coverage.
The tension is a familiar one: Platforms like YouTube act (and generally “feel”) like liberal democracies, with nods to “free speech” and a quasi-judicial moderation process. But they’re corporate dictatorships that can and do moderate however they please, regardless of what their rules say. YouTube’s reexamination of its harassment policies might help it stop people like Crowder from taking advantage of its platform. But it won’t be because the rules are different — it will be because the company decides it doesn’t want the headache.