
Facebook credits automated improvements for finding violent content

The company said it took action against 13.6 million pieces of violent content in the third quarter of this year.
Image: People visit a newly unveiled Meta logo outside Facebook headquarters in Menlo Park, Calif., on Oct. 28, 2021. (Noah Berger / AFP - Getty Images file)

Facebook’s parent company said in a report released Tuesday that it had taken action on 13.6 million pieces of content that depicted or incited violence on the platform during the third quarter of the year.

Meta, the newly rebranded company that includes Facebook, Instagram, WhatsApp and Oculus, took similar actions against more than 3 million instances of content on Instagram, the company said in its Community Standards Enforcement Report.

Facebook takes a variety of actions against content that violates its policies, including removing it, adding warnings before it can be viewed and disabling accounts. The report did not break out how many of those instances were removals, nor did it separate them by language or geographic region. 

Tuesday’s announcement, which marks the first time that Meta has disclosed figures about violence, comes about two weeks after NBC News and other news outlets reported extensively on leaked documents that detailed Facebook’s challenges handling extremism on its platform and growing discontent among some of its employees.

The company said Facebook’s automated systems caught more than 96 percent of violent content before it was reported by users. Meta’s report also said violent content constituted 0.04 percent of content viewed on Facebook. The latest report covers July through September. 

“This is our 11th report, which shares more data and information than any of our peers in the industry,” Guy Rosen, Facebook’s vice president for integrity, said in a call with reporters.

Meta also released figures for the first time about the prevalence of “bullying and harassment” content on Facebook, saying it accounted for around 0.14 percent of content viewed during the third quarter. 

Meta officials said the company’s algorithm-based internal tools for finding bullying content before users report it have improved over the last two years. 

In addition, Meta has continued to work against militia-related and QAnon-related material on Facebook since August 2020, and it provided new figures about such content for the first time since January. 

Monika Bickert, Facebook’s head of global policy management, said that, to date, the company has removed nearly 55,000 “militarized social movement” profiles, more than double the number it reported 10 months ago. 

Similarly, it has removed over 50,000 QAnon-related profiles, nearly triple the figure of 18,300 released in January.