As Anti-Vax YouTube Channels Fueled New Conspiracies, Employees Say They Were Told To Look Away

YouTube executives repeatedly declined proposals to curb the proliferation of conspiracy, propaganda and other toxic video content over a period of years, according to a new investigation of YouTube by Bloomberg's Mark Bergen. Rather than confront algorithmic recommendations pushing extreme right-wing and conspiracy theory videos, YouTube told employees not to look for questionable videos, in order to maintain plausible deniability regarding editorial oversight.

New story: I spent weeks talking to folks who've worked at YouTube (and Google) about how the company has wrestled with recommendations, conspiracy theories and radicalism. https://t.co/FHmpHPyaz3

— Mark Bergen (@mhbergen) April 2, 2019

In 2012, YouTube began focusing on engagement, setting an ambitious goal of one billion hours of video viewing daily. The recommendation engine was rewritten to drive toward this goal, nudging viewers to keep watching by queuing up new recommendations at the end of every video. YouTube hit its billion-hour viewing goal in October 2016.

More than twenty current and former Google employees described to Bloomberg how the overwhelming focus on engagement produced an unwillingness among YouTube executives, including YouTube CEO Susan Wojcicki, to take steps against the dissemination of extreme and false content. Because about 70 percent of total user engagement on YouTube is generated by recommendations, executives were reluctant to consider major adjustments to the recommendation algorithm, preferring peripheral measures like info boxes linking to Wikipedia articles that rebut false claims, such as moon landing trutherism and anti-vaccine pseudoscience.

81% of YouTube users say they watch recommended videos - with 15% saying they watch these videos regularly https://t.co/w5JP73gh4B pic.twitter.com/oRgdFHFupB

— Pew Research Internet (@pewinternet) April 1, 2019

The results have been predictable: even staid political content is frequently followed by recommendations pushing viewers toward alt-right and conspiracy theory material like 9/11 Truth and Pizzagate videos. Moonshot CVE, a London-based group that counters extremism online, found that fewer than twenty YouTube channels had spread anti-vaccination conspiracy theories to more than 170 million viewers, and that after viewers watched those videos, YouTube's recommendation algorithm sent them to additional conspiracy theory content.

Google did not immediately respond to questions from Newsweek, but this article will be updated accordingly.

"What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with—or to incendiary content in general," technological sociologist Zeynep Tufekci writes in a 2018 New York Times editorial. "YouTube may be one of the most powerful radicaling instruments of the 20th century."

Bloomberg also found evidence of internal dissension, which was frequently rebuffed by Wojcicki—who would "never put her fingers on the scale," one source said—and other executives, who rejected proposals like excluding specific videos from automatic recommendations. Many YouTube employees were particularly upset after an in-house postmortem on YouTube's 2016 election coverage revealed the site's list of top election-related videos was dominated by extreme right and conspiracy theory outlets like Breitbart and conspiracy theorist Alex Jones' InfoWars.

An anonymous former executive told Bloomberg that the hands-off approach from executives led to an internal policy of intentional ignorance, with lawyers verbally telling YouTube employees to avoid searching for videos that spread conspiracy theories and false information. Never put in writing, the policy was meant to shield YouTube from liability by preempting pressure to exercise more editorial control over video content. As Bergen puts it, paraphrasing the former Google executive, "If YouTube knew these videos existed, its legal grounding grew thinner."

Employees continued to push back internally, disseminating evidence that a theoretical alt-right category would rival YouTube's music, sports and gaming channels for viewership, demonstrating how central extreme content had become to recommendation-driven engagement and YouTube's profitability.

Employees also pushed back against another plan to re-center YouTube payouts to creators around engagement metrics, which would have rewarded outrage. The plan could have made Alex Jones—currently the target of a lawsuit brought by parents of children killed at Sandy Hook Elementary School over his claims that the mass shooting was an elaborate hoax—one of the highest-paid YouTube stars overnight (Jones was banned from the platform in 2018). The proposal was ultimately rejected by Google CEO Sundar Pichai.

In February, YouTube described new steps to combat the spread of lies and conspiracy theories, most recently creating new restrictions for "borderline content," removing certain videos from the recommendation system and focusing on "responsible growth" over engagement, though how these new metrics apply to the current algorithm remains unclear.


