Meta Collaborates With Industry Experts On New Method For Detecting And Removing Revenge Porn

Meta is joining a new campaign to help safeguard people from 'revenge porn,' in which intimate images of them are posted online without their permission. Since 2018, Meta has had mechanisms in place to help detect and remove revenge porn, but the company is now joining a coalition of support organizations and digital platforms on a new program that gives users an alternative way to track and stop the exploitation of their images online.


Today, Meta and Facebook Ireland are partnering with the UK Revenge Porn Helpline and more than 50 organizations across the world to launch StopNCII.org. The platform is the first of its kind, designed to help people who are concerned that their intimate images (photos or videos of a person that contain nudity or are sexual in nature) may be published without their permission. Drawing on extensive input from victims, survivors, specialists, advocates, and other tech partners, the UK Revenge Porn Helpline, in collaboration with Meta, designed the platform with privacy and security in mind at every step.


The process works as follows: if you're concerned that images or videos of you are being shared online without your permission, go to StopNCII.org and create a case. Creating a case involves using your own device to 'digitally fingerprint' the content in question.


Your content is never uploaded or copied from your device; instead, it is scanned locally to generate a 'hash' that will be used for matching. Only the hash is sent to StopNCII.org; the image or video it was generated from stays on your device.
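The on-device fingerprinting step can be sketched roughly as follows. This is an illustrative Python example, not StopNCII's actual code: the function and file names (`fingerprint`, `build_case`, `photo.jpg`) are hypothetical, and it uses a SHA-256 hash for simplicity, whereas real image-matching systems typically use perceptual hashes (such as Meta's open-source PDQ) so that resized or re-encoded copies of the same image still match.

```python
import hashlib

def fingerprint(path: str, chunk_size: int = 65536) -> str:
    """Compute a local 'digital fingerprint' of a media file.

    Illustrative only: SHA-256 matches byte-identical copies, while
    production systems use perceptual hashes so altered copies match.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def build_case(paths):
    """Simulate creating a case: only hashes are collected for
    submission; the media files themselves never leave the device."""
    return [{"hash": fingerprint(p)} for p in paths]

# Demo with a stand-in file (hypothetical name).
with open("photo.jpg", "wb") as f:
    f.write(b"\xff\xd8\xff\xe0fake-jpeg-bytes")
case = build_case(["photo.jpg"])
print(case)  # only the hex digest, never the image bytes
```

The key privacy property the article describes is visible in the sketch: the only thing that crosses the network boundary is the fixed-length digest, not the media itself.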


That unique hash is then shared with partner tech platforms, including Meta, so they can detect and remove any copies of the images that are shared, or that someone attempts to share, across their apps.
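The platform-side matching step could look something like the sketch below. Again, this is a minimal illustration with hypothetical names (`register_case`, `should_block`), assuming a simple exact-hash lookup; real systems match perceptual hashes within a similarity threshold rather than requiring exact equality.

```python
import hashlib

# Fingerprints received from StopNCII.org, keyed by hash (demo store).
reported_hashes: dict[str, str] = {}

def hash_bytes(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def register_case(media: bytes, case_id: str) -> None:
    """Record the fingerprint shared for a reported image."""
    reported_hashes[hash_bytes(media)] = case_id

def should_block(upload: bytes) -> bool:
    """True if an incoming upload matches a reported fingerprint."""
    return hash_bytes(upload) in reported_hashes

# Simulate: a case is registered, then incoming uploads are screened.
register_case(b"reported-image-bytes", "case-001")
print(should_block(b"reported-image-bytes"))   # True
print(should_block(b"unrelated-image-bytes"))  # False
```

Because the check is a lookup against stored fingerprints, the platform never needs to hold the original intimate image to recognize copies of it.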


It's a structured, organized way to deal with what can be a devastating crime, with victims publicly exposed and shamed via social media and potentially suffering long-term psychological and reputational damage. According to research, one out of every 12 adults in the United States has been a victim of image-based abuse, with young people disproportionately affected. It's a serious problem, far more widespread than many people realize.


The prevalence of revenge porn has increased during the pandemic, with the UK domestic violence charity Refuge reporting a 22% rise in revenge porn reports over the previous year. Simplistic remedies like "just don't take pictures of yourself" ignore cultural reality and are no help after the fact. It's critical that Meta and other social platforms do everything they can to address this growing problem and assist the people it harms.

The widespread use of this hash-based method could be a significant step toward streamlining the procedure and, perhaps, providing victims with a more straightforward path to action.

