Conspiracy Theorists Are Using YouTube To Deny the YouTube Shooting Ever Happened

Conspiracy theorists run amok on the internet, which will come as no surprise to anyone who has ever scrolled through Twitter, read the comments on political Facebook posts or stumbled upon some of the more bizarre subreddits.

However, perhaps more than any other platform, YouTube is the conspiracy theorists' playground. The site has repeatedly come under fire for lax content controls that allow creators to publish, and occasionally even monetize, alarming and defamatory stories under the guise of entertainment.

Conspiracy theorists now appear to be using YouTube's own website to deny that a shooting at the company's Bay Area headquarters ever happened. Others claim that Nasim Aghdam, the woman police believe shot three YouTube employees on Tuesday, was completely fake or a pawn of the so-called "deep state," a conspiracy theory holding that shadowy figures secretly control the government.

As of Friday morning, dozens of videos promoted those claims, even as YouTube strengthens its efforts to remove potentially dangerous content.

"The shootings that we've recently had happen, they're all politically motivated. They're all part of the beast in the deep state," one vlogger stated in a video. "[Nasim Aghdam] is Iranian...Obama loves Iran; he's best friends with Iran. Iran is best friends with Russia. That's your PROOF that Obama is completely behind the destruction of America."

Other videos that claim Aghdam isn't a real person—that footage of her has been computer generated—continue to rack up views, with at least one nearing the 100,000 mark.

Like the most damning conspiracy theories, such videos latch onto a grain of truth to support their claims. In this case, the grain of truth is that convincingly fabricated video, a troubling trend known as "deepfakes," does exist. The term is most frequently used in reference to pornography, and the technology poses a real problem, from revenge porn to doctored images and videos of celebrities.

A screenshot of one of Nasim Najafi Aghdam's videos. Aghdam shot three people at the YouTube campus and was said by her father to have "hated" the video platform. (Source: Nasim Najafi Aghdam)

"Look at the way her face moves..." one vlogger said, using a clip of an NBC-affiliated video. "It doesn't look real."

"They have the technology as we seen with Snap Chat," another said. "I don't believe this is a real person, man. I believe they may have a real person's body, but they put a CGI face on it."

Dr. Gordon Pennycook, who researches the proliferation of fake news at Yale University, told Newsweek that this type of content can be harmful, particularly when it's weaponized to derail political movements or bolster political ideologies that would otherwise prompt more skepticism. This is especially true in the wake of the Parkland, Florida shooting, he said.

"The problem in this domain is that people are intensely partisan," Pennycook said. "They believe these things because it coincides with what their ideology says. It's mostly a matter of lazy thinking, especially on social media, where people are inattentive. That can be especially true for claims that are surprising."

As in Parkland, the gun used in the YouTube shooting was purchased legally, according to police. That news comes amid a groundswell of support for Congress to pass gun restrictions. Pennycook said that could influence the creation of conspiracy videos.

"The reason why it's particularly bad in this case, is because there's more imperative for them to discredit (the true events)," he said. "The stronger the threat to the individual's views, the stronger the response has to be."


YouTube could not immediately be reached for comment. In a study on social media conducted shortly after the 2016 general election, researchers at MIT found that stories containing false information, some of which contained conspiracy-type information, spread faster than counterparts with accurate details.

"Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information," the report stated.

Short of companies tightening restrictions on online content or hiring more people to vet it, Pennycook said the best defense against the spread of conspiracy theories is for people to remain active, critical consumers of information.

"Some people are kind of passively accepting it, and it's those people that we have to worry about. And of course, those people are the most valuable [to conspiracy theorists] because they aren't thinking critically enough."

He continued: "That should be remediated, but we haven't figured that out yet."

Uncommon Knowledge

Newsweek is committed to challenging conventional wisdom and finding connections in the search for common ground.

