YouTube Hiring 10,000 Moderators To Police Inappropriate Content

Rengar

Thread author
Jan 6, 2017
Google-owned YouTube is to hire thousands more moderators to help clean up its content.
The new staff will be responsible for removing videos that are offensive, violent, terrorism-related, or simply inappropriate for children.

YouTube’s new policy comes after months of sustained accusations that children have routinely been exposed to content completely unsuitable for their age group.

Recent examples of such videos include Disney characters firing machine guns at other popular characters, and the cartoon character Peppa Pig “basically being tortured” at the dentist.


This is not an example of inappropriate content. This is the CEO of YouTube, Susan Wojcicki

Money talks
Despite the public outcry, Google only seemed to react after companies started pulling their advertising from the video-sharing site. Either way, the company says it is now recruiting thousands of reviewers to reduce the amount of “problematic content” on its video platform.

“We need an approach that does a better job determining which channels and videos should be eligible for advertising,” YouTube CEO Susan Wojcicki admitted in a blog post.

“As the threats to our platform evolve and change, our enforcement methods must and will evolve to respond to them,” she continued. “But no matter what challenges emerge, our commitment to combat them will be sustained and unwavering. We will take the steps necessary to protect our community and ensure that YouTube continues to be a place where creators, advertisers and viewers can thrive.”

The company said its renewed efforts would also help protect children from dangerous and abusive content, and block hate speech.

Humans and AI
Wojcicki said that in addition to employing humans, YouTube would continue to develop AI to automatically search for and flag contentious, offensive, and inappropriate content for removal.

While the hiring of thousands more real people specifically to make YouTube a safer place has been widely welcomed by the wider media, Wojcicki also said YouTube’s machine-learning algorithms were responsible for blocking and deleting 98% of videos related to violent extremism.

“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content,” Wojcicki also said. Human moderators have reviewed nearly two million videos for violent extremist content in the past six months alone. By way of comparison, Wojcicki said it would have taken 180,000 people working 40-hour weeks to assess the same amount of content.

One day, perhaps, machines will be able to filter out all inappropriate content, but until then, YouTube’s new army of moderators will hopefully be able to take up the slack.
 

RejZoR

Nov 26, 2016
This is not an example of inappropriate content. This is the CEO of YouTube, Susan Wojcicki
I am still laughing

It is. She's an idiot. And Youtube is an absolute dumpster fire. The only reason I'm still hanging around is because of a few creators who are kinda cool. Otherwise I couldn't care less if Youtube disappeared overnight.
 
