1,000 Facebook Workers Are Paid to Read Supposedly Private WhatsApp Messages: Report

An investigation released Tuesday claims Facebook employs more than 1,000 workers who read through millions of messages on WhatsApp, a messaging service known for its end-to-end encryption.

ProPublica produced the report, which examined Facebook's management of WhatsApp, a global messaging subsidiary with around two billion users that is promoted as an especially private network.

A new report claims Facebook workers read through supposedly secret WhatsApp messages. In this image, a Facebook symbol is displayed behind the logos of WhatsApp, Messenger and Instagram on the screen of an Apple... Chesnot/Getty

After Facebook acquired WhatsApp for $19 billion in 2014, Facebook CEO Mark Zuckerberg promised users that their data would remain safe and unscreened by the company.

Contrary to that pledge, ProPublica's report said that Facebook has hired more than 1,000 contractors in Texas, Ireland and Singapore to review users' content.

Zuckerberg once cited WhatsApp's end-to-end encryption as a feature he planned to bring to Instagram and Facebook Messenger. The encryption is said to make all messages on the app unreadable until they reach their intended recipients. Before a user sends a message, a notice appears on the screen that reads, "No one outside of this chat, not even WhatsApp, can read or listen to them."

WhatsApp has also been promoted as being so secure that not even its parent company can open the messages. Indeed, Zuckerberg said during testimony to the U.S. Senate in 2018, "We don't see any of the content in WhatsApp."

Yet ProPublica's report said that contractors are hired specifically to read private messages, and to view images and videos, that WhatsApp users have reported as improper.

According to the report, the workers then determine whether the reported content should be classified as fraud, illegal pornography, terrorist activity, etc. The contractors make their judgments in each case "typically in less than a minute," according to ProPublica.

Carl Woog, WhatsApp's director of communications, verified to ProPublica that teams of contractors sift through messages in an attempt to remove abusive content. However, Woog told ProPublica that WhatsApp did not consider the contractors' roles to be that of content moderators.

ProPublica also reported obtaining a confidential complaint filed last year with the U.S. Securities and Exchange Commission (SEC) about how Facebook used outside contractors, along with artificial intelligence systems and account information, to monitor user messages. The SEC has not taken public action on the complaint, and an agency spokesperson declined to comment on the matter to ProPublica.

The investigation further asserted that Facebook downplayed the amount of data it collects from WhatsApp users, as well as how it used the data. "For example, WhatsApp shares metadata, unencrypted records that can reveal a lot about a user's activity, with law enforcement agencies such as the Department of Justice," ProPublica wrote.

In one instance, WhatsApp user data was given to prosecutors in a case against a Treasury Department employee who leaked confidential documents to a media outlet.

Will Cathcart, head of WhatsApp, has previously acknowledged that the company has worked with law enforcement.

"I think we absolutely can have security and safety for people through end-to-end encryption and work with law enforcement to solve crimes," Cathcart said during a YouTube interview in July with the think tank Australian Strategic Policy Institute (ASPI).

One social media user reacted to the news of the report by noting the seemingly contradictory assertions from Facebook and WhatsApp:

Apparently, WhatsApp can't see your messages, yet they have a team of people who can see your messages and provide their content to law enforcement when necessary. Don't trust Zuck with anything you consider private.

— Keith D. Wilson (@divisionbyzero) September 7, 2021

In an email to Newsweek, a WhatsApp spokesperson said that having workers review reported messages forwarded to them does not interfere with the service's encryption and privacy features.

"WhatsApp provides a way for people to report spam or abuse, which includes sharing the most recent messages in a chat," the spokesperson wrote. "This feature is important for preventing the worst abuse on the internet. We strongly disagree with the notion that accepting reports a user chooses to send us is incompatible with end-to-end encryption."

Update 09/08/21 9:10 a.m. ET: This story has been updated to include a comment from a WhatsApp spokesperson that was obtained after the story was originally published.

Uncommon Knowledge

Newsweek is committed to challenging conventional wisdom and finding connections in the search for common ground.


About the writer


Jon Jackson is an Associate Editor at Newsweek based in New York. His focus is on reporting on the Ukraine ...

