
AI Pimps Are Using Real Women's Videos To Produce Porn

AI pimps are on the rise, and they're making money by exploiting real women.

By Nicole Dominique

Image credit: Instagram/@fit_aitana

You would think AI advancements would fuel efficiency and productivity, help people with their businesses, and more. They certainly do all that, but they also perpetuate degeneracy and raise ethical concerns. Artificial intelligence now allows men to live out their pornographic fantasies by using the likenesses of women without their consent.

Recently, ABC Investigations discovered that an Adelaide, Australia, man, Antonio Alvaro, was part of an international network described as "AI pimps." They used authentic images of women to produce AI-generated pornographic replicas, swapping the models' faces for artificial ones, and then sold the results online. The content was offered on subscription services like Fanvue, an OnlyFans-adjacent platform where people can purchase pornographic images and videos of the AI "model."

An intellectual property lawyer has described the AI pimps' actions as "absolutely reprehensible."

Australia-based model Robyn Lawley shared a video by ABC Investigations exposing the activities of AI pimps. Lawley is promoting a petition urging the Australian government to protect women from these perpetrators. "Absolute craziness what has been allowed! Literally stealing a woman’s image and video and allowed to profit off it!!" she wrote in her caption. "Even worse they use it for porn videos without the woman’s consent! #ai is only getting better by the day, these websites have the ability to use the #ai technology easily. Time the government stepped up and protected our images or even helped with labeling of Ai altered images."

Model Celine Farach was one of the victims featured in the video. People had used her images to catfish men, and when those same men found her real profile, they directed their anger at her. "My perspective on AI is that it's disheartening to see individuals exploiting it to repurpose videos for likes or even to catfish men," she wrote in an email to Evie.

"I've dealt with cases where men conducted a Google reverse image search of my video but with someone else's face on it, only to discover that they were deceived by the person they were communicating with on WhatsApp or online dating sites, leading to them losing thousands of dollars," she continued. "Some of these men still message me today, upset and angry at ME, actually, but I have no connection to that. They want me to take control of the situation, but unfortunately, it was and still is out of my control. The same goes for many other women out there. I cannot imagine how bad this will continue to turn out until they can figure out how to put an end to this."

It's important that we acknowledge the real harm these deepfakes are causing women. The sick individuals who are exploiting AI to live out their fantasies at women's expense deserve serious repercussions. As these women continue to speak out against the rising trend of digital abuse, their stories underscore the importance of holding perpetrators accountable and protecting women from exploitation.
