Facebook Wants Your Nude Pictures in Latest Attempt to Stop Revenge Porn

Facebook is ramping up its efforts to combat the spread of non-consensual "revenge porn" images across its main website, Instagram and Messenger, it confirmed this week.

The move expands on a pilot scheme announced last year that asked any users who were concerned about the spread of intimate images to proactively upload them using the encrypted Messenger service, where photo-matching technology could prevent further sharing. The social network has stressed it does not store the images long term, but instead keeps their unique fingerprint, or "hash."

Initial tests rolled out to a small number of users in Australia last November. On Tuesday, Antigone Davis, Facebook's global head of safety, shared more details about how the plan had evolved in recent months. The website is now working closely with safety, survivor and victim organizations in Canada, the U.K. and the U.S. to help stop private moments being shared online without permission.

"It's demeaning and devastating when someone's intimate images are shared without their permission, and we want to do everything we can to help victims of this abuse," Davis wrote.

"We're now partnering with safety organizations on a way for people to securely submit photos they fear will be shared without their consent, so we can block them from being uploaded to Facebook, Instagram and Messenger," she continued. "This pilot program, starting in Australia, Canada, the UK and US, expands on existing tools for people to report this content to us if it's already been shared."

Partners will include Australia's Office of the eSafety Commissioner, the Cyber Civil Rights Initiative, the National Network to End Domestic Violence, the UK Revenge Porn Helpline and YWCA Canada.

Any user worried they may become a victim of revenge porn can now submit a form via one of the official partner groups, instead of relying on Facebook alone. After submitting the form, the victim will receive an email containing a "secure, one-time upload link" that can be used to upload the image.

A Facebook staffer will assign the picture a hash so that it can't be shared on any Facebook service, including Instagram. Once the unique fingerprint has been created, the victim will be notified via email and the image will be deleted from Facebook's servers within a week.
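Facebook has not published the exact matching algorithm, and in practice systems like this rely on perceptual hashes that survive resizing and re-encoding. As a minimal sketch of the fingerprint-and-block idea only, here is a simplified version using a plain SHA-256 digest (an assumption for illustration, not Facebook's actual method):

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Reduce the image to a fixed-length digest ("hash").
    # Only this fingerprint is kept; the image bytes are not stored.
    return hashlib.sha256(image_bytes).hexdigest()

class UploadFilter:
    """Blocks uploads whose fingerprint matches a previously reported image."""

    def __init__(self) -> None:
        self.blocked: set[str] = set()

    def register(self, image_bytes: bytes) -> None:
        # Hash the reported image, store the fingerprint, discard the bytes.
        self.blocked.add(fingerprint(image_bytes))

    def allow_upload(self, image_bytes: bytes) -> bool:
        # Reject any upload whose fingerprint matches a blocked image.
        return fingerprint(image_bytes) not in self.blocked
```

Note that an exact cryptographic digest like this only catches byte-identical copies; production systems use perceptual hashing so that cropped or re-compressed versions of the same photo still match.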

On Thursday, digital rights group Fight for the Future was criticized by Facebook security chief Alex Stamos for engaging in "shallow snark" about the anti-abuse scheme. Making a jab at the social network for the Cambridge Analytica data scandal, the advocacy group tweeted: "Facebook: We didn't protect your data and we are sorry. We will do better. Also Facebook: Yo, send us your nudes."

Stamos hit back with a lengthy thread. "I'm sad to see a civil rights advocacy organization approach an issue of such impact with shallow snark," he wrote. "They clearly did not read the original post by my incredible colleague […] nor comprehend the hard tradeoffs in dealing with revenge porn."

[Photo caption] Facebook's Antigone Davis: "We want to do everything we can to help victims of this abuse." But will technology really help? (iStock)


About the writer


Jason Murdock is a staff reporter for Newsweek. 

Based in London, Murdock previously covered cybersecurity for the International Business Times UK.

