November 14, 2019
In a transparency report released Wednesday, Facebook said it had removed tens of millions of posts that broke its rules regarding child pornography, hate speech and harassment.
The tech giant said in a Wednesday press release that the report reflects its efforts to combat harmful activity on its platform, which has drawn scrutiny for not doing enough to police child predators and hate speech.
“We’ve made progress in important areas such as hate speech, child exploitation imagery, and posts about illegal drugs and firearm sales but we still have more to do. We’ll continue to invest in people and systems to combat harmful content on our platforms,” Facebook’s official Twitter account wrote in a Wednesday tweet.
Facebook took action against a total of 20.8 million pieces of content related to suicide and self-injury, child nudity and sexual exploitation, and drugs and firearms sales in the third quarter of 2019, according to the release.
Of those 20.8 million pieces of content, 2.5 million related to suicide and self-injury, 11.6 million to child nudity and sexual exploitation of children, and 6.7 million to drug and firearms sales. In each category, Facebook proactively detected the content more than 93% of the time.
Facebook founder Mark Zuckerberg said during a Wednesday conference call that while some people might interpret the large content removal numbers Facebook reports as an indication it has “so much more harmful content” than other platforms, the numbers actually show the tech giant’s efforts to work “harder to identify this and take action on it” than other sites, according to Washington Post tech reporter Tony Romm.
Facebook has taken major strides to increase its transparency regarding the content it removes and its removal policies since a Sept. 28 New York Times investigation found that encryption technology shields sexual predators who engage in illicit behavior with children online and makes it more difficult for law enforcement to track such criminals.
The investigation found that child abusers allegedly used Facebook Messenger in nearly 12 million of the 18.4 million reported instances of sexual abuse material, according to The NYT, citing people who studied the reports.
“The reason why the vast majority come from Facebook is because we work harder than any other company to identify this behavior,” Zuckerberg said when the report was brought up during an Oct. 26 hearing in front of the House Financial Services Committee regarding Facebook’s cryptocurrency project.
Zuckerberg continued to explain that Facebook builds “sophisticated systems” to identify child sex abusers on the platform, adding that he doesn’t “think Facebook is the only place on the internet where this behavior is happening. I think the fact that the vast majority of those results from us reflects the fact we actually do a better job than anyone else of finding it and acting on it.”