
YouTube employees warned about its ‘toxic’ video problems, but the company ignored them


YouTube employees warned the company about its viral disinformation and extremist content problem for years, according to a recent report.

Image: Aytac Unal/Anadolu Agency/Getty Images

Over the past couple of years, YouTube has taken steps to tackle one of the company’s biggest challenges: the rise of toxic content on the platform.

Just recently, the company addressed misinformation with fact-checking information panels that appear on certain video search results. Anti-vaccination videos, which could harm the public, have been demonetized. YouTube has even promised to address its long-criticized recommendations algorithm, so that it would stop actively promoting extremist and conspiratorial content.

There’s no doubt that YouTube is taking platform safety more seriously now than ever before. YouTube certainly wants you to know that. However, a report from Bloomberg now shines a light on how YouTube was consistently warned about these problems by its employees well before it decided to address them. And while YouTube stresses how it has focused on these issues over the past two years, one such rejected proposal could have helped stifle the spread of Parkland shooting conspiracies just last year.

According to former YouTube employees who spoke to Bloomberg, the company was repeatedly warned about toxic content and misinformation on the service but pushed the concerns aside to focus on the growth of the platform. As recently as February 2018, YouTube employees proposed a solution to limit recommended videos to legitimate news sources in response to conspiracy theory videos calling the Parkland shooting victims “crisis actors.” According to Bloomberg, the proposal was turned down. 

Some former senior-level YouTube employees even cited the spread of this type of content as the reason they left the company.

One early YouTube employee, who had worked there before Google acquired the video site in 2006, explained how the site had previously moderated and demoted problematic videos, using content that promoted anorexia as an example. He pointed out how things seemed to change once Google came along and prioritized engagement.

With this push to grow engagement and revenue along with it, toxic videos took advantage of the changes. The problem became so well known that, according to Bloomberg, YouTube employees had a nickname for this brand of content: “bad virality.”

Concerns over videos skirting the company’s hate policies, the recommendation engine pushing disinformation, and extremist content being promoted were effectively ignored. Proposed changes to policies to address these issues were also turned down. The company went so far as to tell staff not on the moderation teams to stop looking for problematic content to flag.

As YouTube notes in its response to Bloomberg’s report, the company has begun to take these issues more seriously. YouTube has been especially responsive to toxic content relating to children. The company has even instituted policies similar to the proposal introduced following Parkland when it comes to breaking news.

The changes now under way at YouTube are undoubtedly good for the platform’s future. But it’s clear that much of this could have been done far sooner.

Unfortunately, some of the damage is already done.

