Facebook’s Oversight Board makes bizarre ruling in its first group of decisions


Facebook’s Oversight Board has officially chimed in on its first five cases — and the rulings are certainly interesting.

The Oversight Board chose to overturn Facebook’s decision to remove content in four out of the five cases. As a result, Facebook must restore those four posts.

The most bizarre ruling from the board involved a post that was flagged as “anti-Muslim hate speech.” A user from Myanmar posted a picture of a Syrian toddler who drowned while trying to reach Europe in 2015. Along with the photo, they included a comment that Facebook translated as, “[there is] something wrong with Muslims psychologically.”

While Facebook removed this post under its Hate Speech Community Standard, the board ruled to reverse this decision and restore the content. According to the board, its own translators claimed the phrase more accurately translated to “those male Muslims have something wrong in their mindset.”

Experts have put some blame on Facebook for the spread of anti-Muslim rhetoric in Myanmar. However, according to the Oversight Board,  “…while hate speech against Muslim minority groups is common and sometimes severe in Myanmar, statements referring to Muslims as mentally unwell or psychologically unstable are not a strong part of this rhetoric.” 

The Oversight Board largely seems to ignore the photo of the child, other than to note that the post will be restored with a warning label under Facebook’s Violent and Graphic Content Community Standard.

When the text is put into context alongside the picture, the post does appear to be dehumanizing a group of people for the crime of…trying to flee the civil war in Syria and ISIS.

Eric Naing, a spokesperson for the civil rights group Muslim Advocates, provided an emailed statement on the ruling to Mashable:

“Facebook’s Oversight Board bent over backwards to excuse hate in Myanmar—a country where Facebook has been complicit in a genocide against Muslims. It’s impossible to square Mark Zuckerberg’s claim that Facebook does not profit from hate with the board’s decision to protect a post showing images of a dead Muslim child with a caption stating that ‘Muslims have something wrong in their mindset.’ It is clear that the Oversight Board is here to launder responsibility for Zuckerberg and Sheryl Sandberg. Instead of taking meaningful action to curb dangerous hate speech on the platform, Facebook punted responsibility to a third party board that used laughable technicalities to protect anti-Muslim hate content that contributes to genocide.”

The other decisions made by the Oversight Board appear fairly straightforward. Take, for example, another case the board reviewed: in October 2020, a user posted a quote falsely attributed to Nazi Germany’s Minister of Propaganda Joseph Goebbels. Facebook removed the post. However, the user argued that they posted the quote in order to criticize then-President Donald Trump, not to disseminate hateful material.

The board ruled in favor of the user, ordering Facebook to restore the post. The decision was based mostly on these two findings: The user was telling the truth about the quote being used to criticize Trump, not promote a Nazi. In addition, the board determined that Facebook did not make its policies about who qualifies as a “dangerous individual” clear. 

Another interesting piece of evidence the Oversight Board used: comments made on the post by the user’s friends. According to the board, those comments made it clear that the quote was being used to criticize Trump. 

The board admonished Facebook for not providing users with a list of examples that fall under its Dangerous Individuals and Organizations Community Standards policy. While the board’s ruling can only compel Facebook to restore the post, it also suggested that Facebook update this policy so users know who and what is designated as “dangerous.”

In addition to that case, the Oversight Board overturned a Facebook decision to remove a post in France that the company claimed fell under its COVID-19 misinformation policy. The board ruled that the post was more of a critique of government policy than a call for Facebook users to take a potentially harmful medication.

The Oversight Board also ruled on a case in which Facebook had already reversed itself. (It restored a post that had been removed by its automated moderation system.) A Brazilian user’s breast cancer awareness post was removed from Instagram for showing female nipples. Although Facebook restored the image before the case made its way to the board, the Oversight Board still wanted to make a ruling on it. Oversight Board rulings provide the user with an explanation of what happened, which the board thought was important. The board also suggested that Facebook make changes to the way its automated content moderation is used.

The one case in which the Oversight Board upheld Facebook’s decision to remove content involved a post containing a slur against the people of Azerbaijan. The board found the post fell under the company’s Hate Speech Community Standard and was used to dehumanize Azerbaijanis.

The Oversight Board is an independent entity tasked with ruling on individual content cases on Facebook’s social media platforms. While it also suggests broader policy changes, only its individual content decisions are binding. 

The board is made up of 20 members, including a human rights lawyer, a former prime minister, and an executive at a right-wing think tank. Users can appeal to the Oversight Board after exhausting review requests for content takedown decisions on Facebook and Instagram. 

Recently, Facebook tasked the Oversight Board with providing a final ruling on whether Donald Trump’s ban from the platform is permanent.

Will the Oversight Board overturn Facebook’s decision and bring Trump back to the platform? It’s unclear when the board will hand down that decision, but based on these five cases, the ruling could go either way.
