Facebook pulled 1.5 million New Zealand shooting videos in 24 hours

Image: Bob Al-Greene / Mashable

We already knew that Facebook moved quickly on Thursday to stop videos of the New Zealand mass shooting from spreading, but now we have some actual numbers.

In a public statement and an identical series of tweets posted by Facebook Newsroom, the company confirmed that 1.5 million videos were removed in the first 24 hours following the terror attack on two New Zealand mosques, which had left 50 dead and 50 injured as of Sunday. Of those, 1.2 million were blocked before they were even uploaded.

The news and accompanying statement come from Facebook exec Mia Garlick.

The statement continues: “Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content.”

It’s helpful to know just how widespread attempts at sharing the shocking video actually were (what’s wrong with y’all??), but Facebook’s limited look at the statistics only paints a partial picture. It’s not clear, for example, how long the 300,000 videos that made it through were actually up, or how much engagement they saw during that time.

It’s also not clear what kind of repercussions await those who attempted to share the video. Facebook’s Community Standards lay out the rules for what kind of content is or isn’t kosher, but actions taken at the account level vary on a case-by-case basis and largely depend on context, such as how the video was shared and the user’s history.

Facebook did designate both of the shootings as terror attacks, which — again, under the site’s Community Standards — instantly placed restrictions on the way they could be talked about on the site. 

“[W]e do not allow any organizations or individuals that proclaim a violent mission or are engaged in violence, from having a presence on Facebook,” the standards read, running down a category that includes terrorists and terror organizations.

The rules go on to note: “We also remove content that expresses support or praise for groups, leaders, or individuals involved in these activities.”

So the moment Facebook labeled the shooting as a terror attack, some automatic rules governing what is and isn’t permissible on the site kicked in. Removing that initial stream also made it easier to hunt down other attempts to share it, since Facebook could use that data to more effectively chase down visually similar videos.

Still, there’s a massive difference between the written rules and the reality of how they’re enforced. The horrific events that occurred in New Zealand were streamed live on Facebook, and several hundred thousand copies of the video made their way to the platform before they were shut down.

That may be a relatively small number for a social network that counts its users in the billions, but it’s still an awfully high number given the added exposure it gave to the alleged shooter’s heinous crime.

All of which is to say: Facebook still has some serious explaining to do, and plenty more questions to answer. That stream never should have happened in the first place, for one. But more important than that is Facebook’s next move. 

What were the lessons learned here? What sort of bigger actions will be taken to prevent this from happening again? Facebook is as helpless as any of us when it comes to ending hatred and violence around the world, but Thursday’s events made it clear that the company needs to come up with better safeguards against its platform being used with evil intent.

