Google: YouTube Hires Counterterrorism Experts to Help Police Website's Videos

YouTube has hired full-time counterterrorism specialists to help police its content, part of its pledge to grow content moderation staff across Google to 10,000 by the end of 2018, the U.S.-based video hosting service said.

For the first time, it published a quarterly report this week detailing the progress made on purging unwanted or illegal content from its website using a cocktail of machine learning technology and human judgment. Between October and December 2017, the website removed more than 8 million videos, it said.

Material removed included a video calling for the murder of gay people, a video promoting the activities of the Islamic State militant group while endorsing it as the most valid option in Syria and a video "purporting to give advice about how to prevent Jewish players from participating in a popular video game."

YouTube promised to bulk up its staff in December last year after coming under fire for hosting alleged child exploitation material. Research first highlighted by BuzzFeed News the month prior exposed how content showing kids in distress—including live-action roleplays featuring "gross out themes" such as eating feces, being subjected to injections or fake kidnappings—had gained millions of clicks.

More recently, CNN reported how YouTube was facing a backlash for running advertisements for major brands, including Facebook, LinkedIn and Netflix, on channels promoting Nazis, white supremacy, propaganda and pedophilia.

YouTube has tackled the issue of terrorism for years, and in June 2017 said that it would make changes to further crack down on "videos that contain inflammatory religious or supremacist content." In its report this week, the site said that most of the promised moderation roles had been filled.

"We've staffed the majority of additional roles needed to reach our contribution to meeting that goal," it said in a blog post accompanying the in-depth analysis. "We've also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and we've expanded regional expert teams," it added.

It said that machine learning technology—introduced in June 2017—had helped to flag content on a wider scale than ever before. Of the 8 million videos removed between October and December last year, 6.7 million were flagged for review by machines rather than human eyeballs. Of those 6.7 million, 76 percent were removed before they received a single view, the platform stated.

YouTube has over 1 billion users, and each day those users watch a collective total of a billion hours of video, the firm has said. The Guardian previously reported that a whopping 300 hours of video are uploaded to the website every minute.

This photo illustration, taken on March 23, 2018, shows YouTube logos on a computer screen in Beijing. Can the website stop rogue videos from being uploaded to its platform? NICOLAS ASFOURI/AFP/Getty Images


About the writer

Jason Murdock is a staff reporter for Newsweek. Based in London, Murdock previously covered cybersecurity for the International Business Times UK.

