In preparation for possible unrest around the election, Facebook is planning to roll out internal tools designed to slow the spread of misinformation in “at-risk” countries, according to a new Wall Street Journal report.
While executives have stated that they would only enact such a plan if something as serious as election-related violence were to occur, the measures reportedly include “slowing the spread of viral content and lowering the bar for suppressing potentially inflammatory posts,” as well as “tweaking the news feed to change what types of content users see.”
In the wake of the platform’s decision to slow the spread of spuriously sourced New York Post reports on Hunter Biden’s business connections in Ukraine, a potential platform-wide action to slow misinformation would certainly amplify Republican complaints of censorship. In a company-wide meeting reported last week by BuzzFeed News, CEO Mark Zuckerberg told employees that Facebook would slow the pace of its interventions after November. He added that “a decisive victory from someone” could “be helpful for clarity and for not having violence or civil unrest after the election,” which could reduce the likelihood that the platform would need to enact its emergency measures. Facebook has previously deployed the emergency measures in South and Southeast Asia, after the United Nations found that the platform had fueled violence against Muslims in Myanmar and it was reportedly used to organize violence against Muslims in Sri Lanka.
The threat of post-election violence in the United States is not hypothetical. The recent domestic-terrorist plot to kidnap Michigan Governor Gretchen Whitmer was planned in part on Facebook. And a review of far-right groups by the Armed Conflict Location & Event Data Project and MilitiaWatch found “that groups are organizing to ‘supplement’ the work of law enforcement.” As a whole, the motivations of the militia groups are shifting “from anti-federal government writ large to mostly supporting one candidate.” That candidate is, unsurprisingly, the same one who refused to disavow a far-right group during the first presidential debate.
In the midst of an election, a pandemic, and a summer of unrest, Facebook has taken unprecedented steps to limit the spread of misinformation and the organization of violent movements on its platforms. In June, it banned hundreds of accounts and pages associated with the loosely organized, anti-government Boogaloo movement. The next month, Facebook deleted a video shared by the president that claimed face coverings are ineffective and that hydroxychloroquine cures COVID-19. In September, Mark Zuckerberg said the platform would discourage premature election-victory claims. In October, Facebook banned all QAnon pages, groups, and Instagram accounts, after the conspiracy theory amassed millions of followers on the platform over the previous three years. Last week, Facebook removed 48 Trump ads that featured the phrase “your vote has not been counted” for violating its voter-interference policy.