Facebook announced Wednesday that it is taking further steps to help purge “revenge porn” from its platform by introducing new preventative tools.
One of the measures stops users from reposting intimate images — such as photos showing someone partially or fully naked — by identifying and cataloging specific images that have already been reported.
“Revenge porn” is an informal term for a nude photograph or video that is publicly and non-consensually shared (typically on the internet) in order to humiliate the subject. Such content is usually put online by an ex-lover out of spite, and has become a niche porn genre since the advent of internet pornography.
The tech company is using “photo-matching technologies to help thwart further attempts to share the image on Facebook, Messenger and Instagram,” the latter two of which Facebook also owns.
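Facebook has not disclosed how its photo-matching works, but systems of this kind commonly rely on perceptual hashing: a reported image is reduced to a compact fingerprint, and re-uploads are flagged when their fingerprints are nearly identical. The sketch below is purely illustrative — it implements a simple average hash over an 8x8 grayscale grid, not Facebook's actual technology.

```python
def average_hash(pixels):
    """Fingerprint an 8x8 grid of grayscale values (0-255) as a 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the grid's average.
    return sum(1 << i for i, p in enumerate(flat) if p > avg)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return bin(h1 ^ h2).count("1")

# A toy 8x8 "image" with a smooth brightness gradient.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
# A re-encoded copy with a slight uniform brightness shift, as might
# happen when someone screenshots and re-uploads a blocked image.
reupload = [[min(255, p + 3) for p in row] for row in original]

# The fingerprints still match closely despite the pixel changes.
assert hamming_distance(average_hash(original), average_hash(reupload)) <= 2
```

The design point is that, unlike a cryptographic hash, a perceptual hash changes only slightly when the image is slightly altered, so a catalog of fingerprints from reported images can catch near-duplicate re-uploads.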
Facebook likely felt the need to substantially curb the depraved practice because of public backlash.
A Houston woman in January said she was suing Facebook for $123 million because it failed to remove images that falsely showed her face on top of naked bodies, according to The Huffington Post.
“According to a study of US victims of non-consensual intimate images, 93% report significant emotional distress and 82% report significant impairment in social, occupational or other important areas of their life,” Antigone Davis, head of global safety, wrote in an official Facebook blog post, explaining why the company considers the new tools necessary.
More than half of victims indicate that they have considered committing suicide due to revenge porn, according to the Cyberbullying Research Center.
Women are disproportionately affected by the criminal phenomenon. More than 80 percent of cyber-stalking defendants are male, TIME reports, citing Department of Justice records, and 90 percent of the victims in a study of 1,606 revenge porn cases were women targeted by men.
Davis also outlines Facebook's reporting protocol and reminds users how to report an image that violates Facebook's guidelines.
She says that “specifically trained representatives” review images, remove them if they are inappropriate, and, in most cases, disable the accounts that share the content.
All of the tools were developed with the help of more than 150 safety organizations and experts from around the world — including Kenya, India, Spain, Sweden, the Netherlands, Turkey and Ireland, as well as Washington, D.C., and New York.
Facebook is becoming much more than a social media company, having rolled out tools for a number of initiatives, including fighting “fake news” and working to prevent suicide on its live-streaming service, Facebook Live.
Send tips to email@example.com.
Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact firstname.lastname@example.org.