How Instagram And Facebook Use Artificial Intelligence (AI) To Fight Revenge Porn


Artificial intelligence (AI) and machine learning are Facebook's new partners in the battle against revenge porn—when intimate images or videos are shared on the internet without the consent of the subject. The act is intended to embarrass or cause distress to the subject of the image.


According to Antigone Davis, Facebook’s Global Head of Safety, in the statement announcing the changes: “By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram.”

Revenge Porn and Its Consequences

According to a study conducted in Australia, nearly one in 10 adults admitted to taking nude photos or recording videos of others without consent. More than 6 per cent shared the images or videos, and close to 5 per cent made threats to do so. One in 25 Americans has been a victim of revenge porn according to the Data & Society Research Institute, while approximately 10 million have been threatened with having their images shared. A portion of posters are hackers who found images in emails or on servers rather than jilted lovers. In fact, 78 per cent of posters aren't actually motivated by negative feelings toward the victim, which is why organisations such as the Cyber Civil Rights Initiative prefer the term "non-consensual pornography" to revenge porn.

Revenge porn can have serious ramifications for victims. It is a form of sexual violence motivated by the desire to inflict humiliation and shame and to control and terrorise the victim. Victims suffer long after the images are removed and often have to deal with mental health consequences that include suicidal thoughts, post-traumatic stress disorder (PTSD), anxiety, depression and more. Beyond mental health, victims can suffer social shaming and negative professional ramifications, including losing their jobs.

Removal Without Reporting

Previously, Facebook’s policy has been to remove non-consensual intimate images when they were reported. Then, the company would use photo-matching technology to ensure that the images weren’t reshared.
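The photo-matching step typically relies on perceptual hashing: an image is reduced to a compact fingerprint, and new uploads are compared against fingerprints of known violating content. Facebook's actual system is not public, so the sketch below is only an illustration of the general technique (an "average hash" over an 8x8 grayscale grid), with the threshold chosen arbitrarily:

```python
# Illustrative sketch of photo-matching via a perceptual "average hash".
# Unlike a cryptographic hash, a perceptual hash changes only slightly
# when the image is re-compressed or lightly edited, so near-duplicates
# can still be caught by comparing fingerprints with a Hamming distance.

def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grayscale grid."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the average.
    return [1 if p > avg else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def is_match(h1, h2, threshold=5):
    # A small distance means the images are near-identical;
    # the threshold here is an assumption, not Facebook's value.
    return hamming_distance(h1, h2) <= threshold

# Toy 8x8 "images": an original and a slightly brightened copy.
original = [[(r * 8 + c) % 256 for c in range(8)] for r in range(8)]
near_copy = [[min(p + 3, 255) for p in row] for row in original]

h_orig = average_hash(original)
h_copy = average_hash(near_copy)
```

In a real deployment the grid would come from downscaling the full image, and the known-content fingerprints would live in an indexed database so every upload can be checked at scale.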

Today, AI and machine learning allow Facebook to find "near-nude images or videos" before anyone reports them. This matters because victims might be afraid to report the image for fear of retaliation, or, in many cases, might not even realise the content was shared. Once the technology finds questionable content, a specially trained member of Facebook's Community Operations team reviews it against the company's Community Standards. If the image violates those standards, it is removed, and in most cases the account that shared it is disabled.
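The flow described above is a classic detect-then-review moderation pipeline: a model flags likely violations, a human confirms them, and confirmed violations trigger removal and account action. As a hedged sketch only (the function names, threshold, and actions are assumptions, not Facebook's internal API):

```python
# Hypothetical sketch of the detect -> human review -> enforce flow.
# The threshold and action names are illustrative assumptions.

REVIEW_THRESHOLD = 0.8  # assumed detector-confidence cutoff

def triage(detector_score, reviewer_confirms_violation=None):
    """Decide what happens to a piece of flagged content.

    detector_score: model confidence (0-1) that the content is a
        non-consensual intimate image.
    reviewer_confirms_violation: outcome of the human review against
        Community Standards, or None if no review has happened yet.
    """
    if detector_score < REVIEW_THRESHOLD:
        # Below the cutoff, the content is not escalated.
        return {"action": "no_action"}
    if reviewer_confirms_violation is None:
        # High-confidence detections go to a trained human reviewer.
        return {"action": "queue_for_human_review"}
    if reviewer_confirms_violation:
        # Confirmed violations are removed and, in most cases,
        # the sharing account is disabled.
        return {"action": "remove_content", "disable_account": True}
    # Reviewer found no violation: leave the content up.
    return {"action": "no_action"}
```

The key design point mirrored here is that the model never removes content on its own; it only routes likely violations to a human, which keeps false positives from silently taking down legitimate posts.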

Additional Safeguards and Victim Resources

Facebook will also expand its current pilot programme, run in collaboration with victim advocate organisations, which gives individuals an emergency option to securely submit a photo to Facebook so it can be blocked from ever being shared on the platform.

In addition, a victim-support hub called Not Without My Consent is available through Facebook’s Safety Centre for victims of revenge porn. This resource helps victims find organisations and information for support when involved in a revenge porn incident. Facebook also plans to create a victim-support toolkit that's locally and culturally relevant, in partnership with the Revenge Porn Helpline (UK), Cyber Civil Rights Initiative (US), Digital Rights Foundation (Pakistan), SaferNet (Brazil) and Professor Lee Ji-yeon (South Korea).

Research-Based Approach to Revenge Porn

In pursuit of better protecting victims, Facebook undertook its own research and partnered with international safety organisations to determine the best way to address revenge porn when found anywhere on Facebook, Instagram or Messenger. The victim-focused study attempted to understand the considerations of victims, how the reporting tools needed to be changed to better support them, and how best to safeguard the platform.

Facebook used the information gathered from the research and conversations with victims to alter its tools. It learned that the reporting tools needed to be easier to use and that victims needed a fast, personalised response to their claims. It also realised that some people didn’t even know the reporting tools were available.

The announcement of these new tools to fight revenge porn is in line with CEO Mark Zuckerberg's commitment to the company's pivot to privacy, in which the goal is to build a privacy-focused social networking and messaging platform as well as a company that “can evolve to build the services that people really want.”

With the help of artificial intelligence and machine learning, Facebook may finally have the systems in place to combat revenge porn on its platform.

Written by

Bernard Marr

Bernard Marr is an internationally bestselling author, futurist, keynote speaker, and strategic advisor to companies and governments. He advises and coaches many of the world’s best-known organisations on strategy, digital transformation and business performance. LinkedIn has recently ranked Bernard as one of the top 5 business influencers in the world and the No 1 influencer in the UK. He has authored 16 best-selling books, is a frequent contributor to the World Economic Forum and writes a regular column for Forbes. Every day Bernard actively engages his almost 2 million social media followers and shares content that reaches millions of readers.