Flat Earth, Illuminati, and Fake Moon Landing: Are Conspiracy Theory Videos Hurting YouTube?

The YouTube and Netflix app logos are seen on a television screen on March 23, 2018, in Istanbul, Turkey. Chris McGrath/Getty Images

Louie Veleski has some interesting opinions. He thinks ghosts exist and humans have never been to the moon. A resident of Melbourne, Australia, Veleski expounds on his points of view on his YouTube channel, Better Mankind, which earns him up to $5,400 a month.

Conspiracy theories, it turns out, are very profitable for the YouTube-inclined entrepreneur. On his channel, Peladophobian, Ryan Silvey, 18 and also from Australia, posts videos like "School Is Illuminati" and "Donald Trump Is Vladimir Putin." Though satirical, the videos may be lumped in with other contrarian or esoteric posts in search results. Silvey makes more than $7,500 a month on average from advertisements that some of his 628,000 subscribers view.

YouTube also makes a bundle. About 55 percent of the money companies pay to put their 30-second ads at the start of popular videos goes to the content creators. The rest goes to Alphabet, the site's parent company. It reported more than $110 billion in revenue in 2017 (up from $90 billion in 2016). Nearly 90 percent of that figure came from ads, and a growing number were on YouTube.

Created in 2005, YouTube is the internet's dominant video platform. People around the world watch about 1 billion hours of video on the site each day, and more of them are using it as a news source. But media reports have implicated YouTube in the spread of fake news and extremism, often on account of conspiracy videos touting false information. With Facebook now under government scrutiny and possibly facing regulation, YouTube is taking measures to protect its own integrity. And that could mean the end of the conspiracy video business.

Concern about these videos could seem overblown. Take a post claiming a geomagnetic storm on March 18 would "[disrupt] satellites, GPS navigation and power grids across the planet." Some news outlets took the claim as fact until U.S. scientific agencies refuted it. That video was misleading but likely harmless.

But others may have played a part in recent tragedies. One of the attackers who drove a van into pedestrians on London Bridge in June 2017 and stabbed patrons in nearby bars may have watched videos from a Salafist preacher on YouTube. After the August 2017 rally in Charlottesville, Virginia, by the so-called alt-right, The New Republic called the platform "the Worldwide Leader in White Supremacy." After the Las Vegas shooting in October 2017, The Wall Street Journal caught the algorithm suggesting videos claiming the event was a false flag. Until the algorithm changed, the top five results for a search for "Las Vegas shooting" included a video claiming government agents were responsible for the attack.

"From my experience, in the disinformation space," wrote Jonathan Albright, the research director at the Tow Center for Digital Journalism, in an essay on Medium, "all roads seem to eventually lead to YouTube."

Addressing the problem is tricky because what constitutes a conspiracy isn't always clear, says YouTube. Do predictions for 2018, including that Italy's Mount Vesuvius will erupt and kill hundreds of thousands of people, count? What about Shane Dawson, who routinely posts conspiracy videos on his channel but doesn't necessarily endorse the theories he discusses? One video, which posits, among other things, that aliens may be connected to the disappearance of Malaysia Airlines Flight 370, begins with the disclaimer that "these are all just theories" and "they're not meant to hurt or harm any company."

Part of the issue is the difficulty of pinpointing whether a post qualifies as a baseless, fringe view. Without a clear definition, YouTube's algorithm can't filter such videos out of its search results. That's a problem for Alphabet, which fears the spread of conspiracy videos across YouTube could backfire: false information seeping into the top recommended video lists could eventually drive away customers, meaning anyone who watches YouTube videos. "Our brands may also be negatively affected by the use of our products or services," Alphabet's 2017 annual report stated, "to disseminate information that is deemed to be misleading."

Illustration by Brian Stauffer/theispot

Yet the site incentivizes content creators to wander close to the edge of extreme views, because those videos entice users to click. That video by Dawson about the disappeared plane garnered 8 million views, likely earning him—and Alphabet—thousands of dollars. AlgoTransparency, a website that tracks which videos YouTube recommends to visitors, notes that in February, searches for the phrases "Is the Earth flat or round?" or "vaccine facts" led to videos claiming to show proof the Earth is flat or evidence that vaccines cause autism, respectively, about eight times more often than to videos without a conspiracy bent on those subjects. When Veleski began producing conspiracy-type videos, he received more views—and more money—for them than for those focused on alternative medicine and health topics.

YouTube has some radical views of its own. In January, the site announced that videos on controversial topics like chemtrails (the condensation trails left by airplanes, which some people believe contain dangerous chemicals) would no longer be eligible to run ads. And later this year, panels will accompany any video on a topic surrounded by conspiracy theories, such as the moon landing or John F. Kennedy's assassination. These pop-ups will contain supplemental information from third-party sources like Wikipedia (the company declined to name other potential sources).

Veleski isn't looking forward to the change. As he sees it, the encyclopedia-based panels will denigrate what many people consider to be legitimate, if controversial, perspectives on important topics. "To make a topic look silly because it's not mainstream," he says, "I don't think it's entirely fair."

When it comes to true believers, though, the strategy of posting facts alongside these videos might not work anyway. Jovan Byford, a researcher at the Open University in the U.K., points out the flaw in using rational arguments to debunk conspiracy theories. "That doesn't work," he says. "Their argument to that will be: Well, that's what they want you to believe."