WhatsApp Child Porn Groups Exposed: Facebook's Chat App Allegedly Hosting Sex Abuse Images and Videos


Facebook-owned messaging application WhatsApp has been accused of failing to adequately combat the sharing of child sex abuse material on its platform.

The Financial Times newspaper reported Thursday that one recently active group chat, titled "Kids boy gay," contained more than 250 members, some from the U.S. Participants in the group were reportedly requesting "cp videos," referring to child abuse material.


The finding comes after two Israeli charities—Netivei Rishet and Screensaverz—warned Facebook about the spread of the material in September, the FT reported. The groups said initial requests for meetings with the local head of policy and communications at Facebook's Israel office, Jordana Cutler, went unanswered.

While WhatsApp has since shuttered the "Kids boy gay" chatroom, the FT reportedly found that several groups were still "extremely active," protected in part by the app's strong encryption, which shields the content of user messages even from the company itself.

Known as end-to-end (E2E) encryption, the technology is used in messaging applications to protect privacy and bolster security. Law enforcement has claimed it hampers the ability to catch criminals, but experts warn that weakening the system would leave everyone at risk.

"End-to-end encryption ensures only you and the person you're communicating with can read what's sent and nobody in between, not even WhatsApp," the company says on its website.

The technology means the same processes used to scan other applications and platforms for illegal material may be restricted on WhatsApp, in theory letting crimes take place undetected. Facebook uses a popular tool known as PhotoDNA to scan its vast social network; the protocol helps tech companies find and remove known child exploitation images.

Professor Hany Farid, a developer of the PhotoDNA system, told the FT more work needs to be done by tech companies to combat the spread of abuse material. "You would think we could all just get behind this," he said. "The companies have done essentially nothing."

According to the charities, which reportedly managed to collect more than 800 videos and images from the group chats, Facebook said that working with police might be the best way to combat the material. The charities reportedly filed a police complaint and contacted a politician. Frustrated by Facebook's slow response, they compiled a report and sent it to the FT, the newspaper said.

Zohar Levkovitz, an Israeli technology executive, served as an intermediary, it added.

Newsweek understands that WhatsApp was provided with only one sample of the recent child abuse material from the newspaper—after which it banned every group member. The group in question, roughly a day old, had already been flagged for review, the company indicated.

The company uses machine learning to scan unencrypted information on the platform, such as profile and group photos, automatically comparing it against PhotoDNA's databases of known material. If anything is flagged, the company bans the uploader and every member of the chat room.
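The matching step described above, comparing an uploaded image against a database of hashes of known material, can be sketched in simplified form. The snippet below is purely illustrative: PhotoDNA is a proprietary Microsoft technology whose perceptual hashes tolerate resizing and recompression, whereas this sketch substitutes an exact cryptographic hash (SHA-256) and a hypothetical hash set, solely to show the workflow.

```python
import hashlib

# Hypothetical database of hashes of known flagged images (illustrative
# only; a real system would hold perceptual hashes, not SHA-256 digests).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"bytes-of-a-known-flagged-image").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the image's hash matches the known-bad set.

    Simplified: an exact SHA-256 match only catches byte-identical
    copies. PhotoDNA's perceptual hashing is designed to also match
    images that have been resized, cropped or recompressed.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

print(is_flagged(b"bytes-of-a-known-flagged-image"))  # prints True
print(is_flagged(b"some-unrelated-image"))            # prints False
```

The key design point is that the platform never needs to store the images themselves, only their hashes, which is why this approach works on unencrypted surfaces (profile photos, reported content) but not inside end-to-end encrypted messages.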

On Thursday, a WhatsApp spokesperson told Newsweek: "WhatsApp has a zero-tolerance policy around child sexual abuse. We deploy our most advanced technology, including artificial intelligence, to scan profile photos and images in reported content, and actively ban accounts suspected of sharing this vile content. We also respond to law enforcement requests around the world and immediately report abuse to the National Center for Missing and Exploited Children."

The statement added, "Sadly, because both app stores and communications services are being misused to spread abusive content, technology companies must work together to stop it."

For context, WhatsApp reportedly scrubbed 130,000 accounts relating to the sharing of illegal content in a recent 10-day period. The software boasts more than 1 billion users.

Local media outlets in India reported in August that members of an "international WhatsApp group" had been detained by police. The Times of India reported at the time that the chat room had contained 217 members and that the suspects arrested were mostly between 18 and 25.

As reported in April last year, a major law enforcement operation targeting the sharing of child sex abuse material on mobile apps led to 38 arrests across Latin America and Europe.

Dubbed Operation Tantalio and coordinated by Interpol and Europol, the probe was launched in 2016 after Spanish police found "dozens" of WhatsApp groups sharing illegal content. "These offenders are pushing the boundaries of modern technologies," warned then-Europol Director Rob Wainwright at the time.


About the writer


Jason Murdock is a staff reporter for Newsweek. 

Based in London, Murdock previously covered cybersecurity for the International Business Times UK.

