YouTube to hire 10,000 moderators in major clampdown on ‘bad actors’

6 Dec, 2017 01:39

YouTube will dedicate more than 10,000 staff to tackle “bad actors” who are using the Google-owned video hosting platform to distribute extremist and disturbing content, as well as to “mislead, manipulate, harass and harm” viewers.

“In the last year, we took action to protect our community against violent or extremist content, testing new systems to combat emerging and evolving threats,” YouTube CEO Susan Wojcicki wrote in the Telegraph.

The video-sharing website has overhauled its policies to restrict what types of content can appear on the platform, while investing heavily in machine-learning technology that takes down videos and comments violating those policies.

Despite the heavy investment in technology, the site still relies overwhelmingly on humans to make the call on whether content is offensive. In the past six months, YouTube’s “trust and safety” teams have reviewed nearly two million videos. More than 150,000 videos featuring inappropriate content were removed during that time.

Apart from moderating videos, Wojcicki said the company is also trying to clean up YouTube’s comment section, which is routinely slated as one of the most toxic places on the internet.

“We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether,” she said.

The video hosting giant is also clamping down heavily on disturbing material aimed at children. Last month, it “expanded its enforcement guidelines” concerning the removal of content featuring minors “that may be endangering a child, even if that was not the uploader’s intent,” according to Johanna Wright, vice president of product management.

YouTube further said that it had, since June, been removing ads from videos “depicting family entertainment characters engaged in violent, offensive, or otherwise inappropriate behavior,” preventing their creators from capitalizing on such content.