How is Meta Taking Down Violent Content from its Platform?

Meta has nearly doubled the amount of violent content it removes from Facebook. During the first quarter of 2022, the company took down 21.7 million pieces of content for violating its rules on violence and incitement, up from 12.4 million in the previous quarter.

Flagging Violent Content on Facebook

Takedowns were also up for the quarter on Instagram, though only slightly. The company removed 2.7 million pieces of content for violating its rules on violence, up from 2.6 million during the last quarter of 2021.

The company shared the new metrics as part of its quarterly Community Standards Enforcement Report. In the report, Meta attributed the increase in takedowns to an “expansion of our proactive detection technology.” More than 98 percent of the posts it took down were removed before users reported them, according to the company.

The report comes at a moment when Meta is facing scrutiny over its response time following the recent mass shooting in Buffalo, New York. Recordings of the shooting circulated on Facebook and other platforms, and companies were slow to take down all the new copies. One copy posted to Facebook was shared more than 46,000 times and was not removed until more than nine hours after it was originally posted.

Developing Robust Measurements

As with prior mass shootings such as the Christchurch attack, people’s ability to quickly download and make new copies of live recordings has tested Meta’s ability to enforce its policies. “One of the challenges we see through events like this is people create new content, new versions, new external links to try to evade our policies [and] evade our enforcement,” Guy Rosen, Meta’s VP of Integrity, said during a call with reporters. “As in any incident, we’re going to continue to learn to refine our processes, refine our systems to ensure that we can detect [and] we can take down violating content more quickly in the future.”

Meta also shared updated stats on content it mistakenly takes down. For violent content, the company said it eventually restored 756,000 Facebook posts that were appealed after their initial removal. The company said it is also “working on developing robust measurements around mistakes,” but did not explain what it would measure beyond restores of appealed content.
