Facebook Says Its Employees Will View Your Nudes if You Use Its Anti-Revenge Porn Program


In an attempt to combat the rise of revenge porn on its platform, Facebook is asking users to upload any nude photos they fear may be distributed without consent — a process which involves a Facebook employee reviewing the uploaded images.

Piloting the program in Australia, Facebook has teamed with the Australian government’s eSafety division, with an aim to prevent intimate images being shared without consent on all of its platforms (this includes Messenger, Instagram and Facebook Groups).

The entire process is as follows:

  • A person worried that intimate photos of themselves are being shared online fills out a form on the eSafety Commissioner’s website;
  • The user then sends the photo(s) to themselves on Facebook Messenger;
  • While this is happening, the eSafety Commissioner’s office notifies Facebook of the person’s submission;
  • Facebook’s community operations team uses “image matching technology” to prevent the image from being uploaded or shared online. At least one “specially-trained representative” will review the image(s) before hashing them;
  • Hashing an image converts it into a digital fingerprint — a series of numbers — that is used to block attempts to upload the image to Facebook’s platforms;
  • The user is then prompted by Facebook to delete the image they have sent to themselves.

In a blog post on Thursday, Facebook confirmed that at least one company employee will view the nude photos users upload.

In a post on its Newsroom portal, Facebook’s global head of safety, Antigone Davis, wrote that a “specially-trained representative” from the social network’s Community Operations team will review the image before “hashing” it.

Facebook then stores the hash, which it says “creates a human-unreadable, numerical fingerprint of it,” but not the photo itself. This helps prevent future uploads, whether or not you’re comfortable with an employee seeing your nudes first.
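The hash-and-match idea can be sketched in a few lines. Note that Facebook has not published its exact algorithm; real image-matching systems use perceptual hashes (which survive resizing and re-encoding), whereas the cryptographic SHA-256 hash used in this illustration only matches byte-identical files. The function names here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Convert an image into a human-unreadable numerical fingerprint.
    Illustrative only: SHA-256 stands in for a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

# The service stores only the fingerprints of reported images,
# never the photos themselves.
blocked_hashes = {fingerprint(b"reported-intimate-image")}

def is_blocked(upload: bytes) -> bool:
    """Check an incoming upload against the stored fingerprints."""
    return fingerprint(upload) in blocked_hashes

print(is_blocked(b"reported-intimate-image"))  # True: matches a stored hash
print(is_blocked(b"some-other-photo"))         # False: no match stored
```

The key design point is that the fingerprint is one-way: the stored number can block re-uploads of the same image, but the image cannot be reconstructed from it.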

This new system from Facebook builds on an announcement in April, when the company first said it would be introducing new tools to support people who had images shared on Facebook without consent.

Previously, users were encouraged to use Facebook’s “report” feature to block further sharing of images that had already been uploaded.

The new hashing program will give users the ability to inform Facebook themselves and thus stop the image from being uploaded in the first place.

“The safety and well-being of the Facebook community is our top priority,” said Davis in a statement.

“These tools, developed in partnership with global safety experts, are one example of how we’re using new technology to keep people safe and prevent harm.”
