Imagine what it's like to be the victim of revenge porn.

Someone snaps an inappropriate photo of you in the bathroom, and then you discover it is all over the Internet and social media. It can cause serious stress and even depression.

"Revenge porn is obviously unethical and immoral," Blank Rome divorce attorney Stacy Phillips told Fox News. "It's a life-altering problem for the victim because being exposed in a compromising situation (such as being photographed or videotaped in the nude or in sexual acts) violates the victim's privacy and causes immense humiliation."

Now, Facebook has decided to do something about the problem.

For now, there's no artificial intelligence that can identify an image as non-consensual on its own. However, once a user reports an image as revenge porn, Facebook uses pattern recognition to block the propagation of that image and to warn users who try to share it.

"The pattern recognition in this revenge porn application is a form of content-based retrieval or associative memory, which is how human olfactory memory is thought to work," says Dr. Lav Varshney from the University of Illinois at Urbana-Champaign, an AI expert and member of IEEE's Signal Processing Society. "If any future image is similar to what is memorized, it will point it out."
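
Facebook has not published the technical details of its matching system, but the kind of content-based retrieval Varshney describes is commonly built on perceptual hashing: an image is reduced to a compact fingerprint that changes little when the picture is resized or re-compressed, so near-copies of a reported image can be recognized later. The Python sketch below is only an illustration of that general idea; the file names and the distance threshold are hypothetical assumptions, not Facebook's actual system.

```python
# A minimal sketch of content-based image matching via perceptual hashing.
# File names and the distance threshold are illustrative assumptions;
# Facebook has not disclosed the details of its matching technique.
from PIL import Image  # pip install pillow

def average_hash(path, hash_size=8):
    """Shrink the image to a tiny grayscale grid and record which pixels are
    brighter than the average: a compact, reuse-tolerant fingerprint."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    return [1 if p > avg else 0 for p in pixels]

def hamming_distance(h1, h2):
    """Count differing bits; a small distance means the images look alike."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

# "Memorize" the fingerprint of an image a user has reported.
reported = average_hash("reported_image.jpg")      # hypothetical file

# Later, compare a new upload against the memorized fingerprint, even if the
# copy was re-saved, resized, or slightly altered.
candidate = average_hash("new_upload.jpg")         # hypothetical file
if hamming_distance(reported, candidate) <= 5:     # threshold is an assumption
    print("Likely a copy of a reported image: block it and warn the uploader.")
```

Real deployments index enormous numbers of such fingerprints so that every new upload can be checked automatically, which is the scale argument Zeiler makes below.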

Varshney says the Facebook alerts help because the user might think twice about posting the image on other social networks or by text message in the future.

For AI to block the initial image, however, Varshney says there has to be context, and no current AI can understand the context of a photo or determine whether it was consensual. That's why the reporting step is so important: it lets the machine learning take over.

Matt Zeiler, the CEO and founder of machine vision company Clarifai, says Facebook is using AI for revenge porn pattern recognition because of the scale: hundreds of thousands of images are shared and reposted on the social network, and no human operator could ever identify and block all of them. In the end, this does help victims because it reduces the number of shares, a frequent cause of emotional distress.

Still, while the experts tell Fox News this is a good step, Facebook could do more.

Dr. Holly Jacobs, a victim of nonconsensual pornography herself, is the Founder, President, and Executive Director of the Cyber Civil Rights Initiative. She says she likes that Facebook is warning users and blocking additional posts, but even one posted revenge porn image is too many.

"Once an image is uploaded, it's out there for others to see, download, share, and upload to other sites within a matter of seconds," she says. "The best way to prevent any harm from being done to a victim is to block an image from being uploaded or otherwise shared the first time around. We have been pushing and will continue to push the tech industry to make this possible."

For that to happen, say experts, the AI will need to improve far beyond its current capability. As Varshney noted, it would have to understand the context before an image is ever posted — who is in the photo, why it might cause emotional harm, and why it should be blocked.

The AI will get there eventually, but for now, all of the experts applauded Facebook for taking action on this sensitive and potentially life-altering issue.

Read more: www.foxnews.com