At the end of January, YouTube announced it will “begin reducing recommendations of borderline content and content that could misinform users in harmful ways.” It cited videos offering a “phony miracle cure,” declaring the Earth flat, or lying about 9/11 as examples of content that fits this description. That’s great! Absolutely. If somebody is watching a genuinely educational video about the measles vaccine, the very next thing they are served shouldn’t be an hours-deep rabbit hole of clips disseminating BS about the alleged dangers of vaccinating children against a disease they don’t need to be susceptible to in the year 2019.
The change got some press over the weekend after a former Google engineer who helped build the AI responsible for serving related videos to YouTube viewers tweeted a thread calling the change a “historic victory.” Guillaume Chaslot describes a fictional person named Brian who, depressed following a death in his family, spends a lot of time watching YouTube videos. This quickly leads him to conspiracy-theory videos, which Brian begins watching more frequently, triggering the AI to show him more of the same. This increases views on those videos, which prompts creators to make similar content. It’s a vicious circle that Chaslot says “slowly biases YouTube.” Chaslot also mentioned a man in Seattle who stabbed his brother to death with a sword because he believed his brother was a “lizard.” The man’s YouTube presence involved an array of far-right conspiracy theories.
YouTube’s announcement is certainly good news. But it’s also as late as it is good. I’m thinking here about people like David Hogg, who survived the Parkland school shooting that left 17 of his classmates and teachers dead, only to endure viral videos peddling a conspiracy theory that he was not a high schooler but a paid crisis actor. Following the shooting, one such video spiked to the No. 1 trending spot on YouTube before the platform finally took it down. A different video from the same user, purporting to show Hogg “forgetting his lines,” was left up even after the first was removed. (It has since been deleted as well.) This week, Valentine’s Day will mark one year since the Parkland shooting.
Chaslot wrote that YouTube had two options when it came to curbing conspiracy-theory videos: let “people spend more time on round earth videos,” or “change the AI.” “YouTube’s economic incentive is for solution 1,” he continued. “After 13 years, YouTube made the historic choice to go towards 2.”
It feels like we’re giving YouTube way too much credit here. YouTube didn’t have to give Alex Jones, a man who claims the shooting at Sandy Hook didn’t happen, a platform for as long as it did. (YouTube finally banned Jones in August 2018.) Just as, before January, it didn’t have to let people keep posting scientifically debunked schlock about how vaccines cause autism just because those videos technically weren’t violating the rules. The company isn’t going with option two at great cost to its bottom line. It’s going with option two because the cost of being called out for sticking with option one for so long has become untenable.