Ever taken a nude selfie? How about filming a kinky video for your partner? If so, then you could be at risk of falling victim to revenge porn. But don’t worry, Facebook have the answer! Of course they will need to review your nudes first. It’s for a good reason, honest…
What is revenge porn?
Revenge porn is when a person shares explicit content of someone else without their permission. In recent years this toxic trend has skyrocketed across social media platforms such as Facebook. In fact, a May 2017 Guardian news article revealed that Facebook received more than 51,000 reports of revenge porn in January 2017 alone, which led the social networking giant to disable over 14,000 accounts.
It gets even worse: a 2016 Data &amp; Society study found that approximately 1 in 25 Americans have had someone post an explicit image of them without their permission (or threaten to do so). Believe it or not, this figure rises to 1 in 10 for women under 30.
Perhaps the most worrying statistic came from a June 2017 survey of Facebook users carried out by the anti-revenge-porn advocacy group the Cyber Civil Rights Initiative. This survey revealed that 1 in 20 social media users have posted a sexually graphic image without the consent of the person in the photo.
What will put an end to this revenge porn epidemic? Well, Facebook believe the answer may lie in their new pilot program. Currently being tested in Australia, the program sees Facebook join forces with the eSafety division of the Australian government in an effort to prevent explicit images from being shared without consent across Facebook, Facebook Groups, Messenger and Instagram.
This pilot program involves Facebook asking users to upload any nude photos that they think may be distributed without their consent. By doing so, Facebook employees can review these uploaded images in order to prevent them from being shared. No, we're not kidding.
So how does it work?
First things first, a user who fears that they may be a victim of revenge porn fills out a form on the Australian eSafety Commissioner’s website.
From here, the user sends the explicit content in question to themselves via Facebook Messenger whilst the eSafety Commissioner notifies Facebook of the user’s photo submission.
Facebook's community operations team will then use 'image matching technology' to prevent these photos from being uploaded or shared online. To do so, at least one 'specially trained representative' will review the images (don't worry, we're sure they'll only take a quick peek!).
After the photos have been reviewed, they will be 'hashed' – i.e. converted into an 'unreadable numerical fingerprint'. This process will block any attempt by other users to upload these images across all of Facebook's platforms. As Antigone Davis, Facebook's Global Head of Security, explained in a November 2017 press release:
“We store the photo hash—not the photo—to prevent someone from uploading the photo in the future. If someone tries to upload the image to our platform, like all photos on Facebook, it is run through a database of these hashes and if it matches we do not allow it to be posted or shared… Once we hash the photo, we notify the person who submitted the report via the secure email they provided to the eSafety Commissioner’s office and ask them to delete the photo from the Messenger thread on their device. Once they delete the image from the thread, we will delete the image from our servers”.
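The hash-and-block scheme Davis describes can be sketched in a few lines of Python. This is a minimal illustration, not Facebook's implementation: it assumes an exact-match cryptographic hash (SHA-256) over the raw image bytes, whereas Facebook's 'photo-matching technology' almost certainly uses a perceptual hash that also catches resized or re-encoded copies. The `HashBlocklist` class name is our own invention for the example.

```python
import hashlib


class HashBlocklist:
    """Stores only hashes (numerical fingerprints) of reported
    photos, never the photos themselves.

    Illustrative sketch only: SHA-256 matches byte-identical
    copies, unlike the perceptual matching a real platform uses.
    """

    def __init__(self):
        self._hashes = set()

    def report_image(self, image_bytes: bytes) -> str:
        # Hash the reported photo, keep only the fingerprint,
        # and let the caller discard the original bytes.
        digest = hashlib.sha256(image_bytes).hexdigest()
        self._hashes.add(digest)
        return digest

    def allow_upload(self, image_bytes: bytes) -> bool:
        # Every new upload is hashed and checked against the
        # database of reported fingerprints before it is posted.
        return hashlib.sha256(image_bytes).hexdigest() not in self._hashes
```

Note the design trade-off the critics seize on: the hash itself reveals nothing about the image, but the reported photo still has to be transmitted and processed once in order to compute it.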
Facebook will then prompt the user to delete the image they sent to themselves. And hey presto, your nudes are protected! In theory…
Is it safe?
Unfortunately, as with most news stories these days, the core concern is one of confidentiality. With Facebook currently facing scrutiny over various scandals (such as the use of Russian bots during the 2016 US presidential campaign), many users are reluctant to trust that Facebook will simply review and delete these images rather than storing them. As digital forensics expert Lesley Carhart warned in a November 2017 VICE Motherboard article:
“Yes, they’re not storing a copy, but the image is still being transmitted and processed. Leaving forensic evidence in memory and potentially on disk… My speciality is digital forensics and I literally recover deleted images from computer systems all day—off disk and out of system memory. It’s not trivial to destroy all trace of files, including metadata and thumbnails”.
However, Julie Inman Grant, Australia's eSafety Commissioner, has been quick to reassure users that it is the links to these photos that will be stored rather than the explicit images themselves. As she explained in a November 2017 ABC News report:
“They’re not storing the image, they’re storing the link and using artificial intelligence and other photo-matching technologies…So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded”.
Similarly, Cindy Southworth, the executive vice president and founder of the Safety Net Technology Project, stressed the importance of Facebook's pilot program in a November 2017 Fast Company article:
“The timing is unfortunate because of scrutiny on other issues… I think it’s getting overblown because of much bigger frustration over Russia… I don’t want victims of domestic violence to lose out because of this timing… I applaud Facebook’s boldness in wanting to go the extra mile… Personally, I really hope that this overblown backlash doesn’t cause the company to back away from a plan to help victims globally. That would be devastating to me. There’s no other company saying [they’re] going to help proactively”.
The bottom line
As matters stand, this Australian pilot program is still in its infancy. However, if it proves successful, it could set a precedent for how Facebook resolves reports of revenge porn. What are your thoughts? Do you trust Facebook not to store your personal images? Or do you think there is a better solution for combating revenge porn? Let us know in the comments section below!