Deepfakes Are The Latest Digital Threat To Women
Deepfakes are still new enough that they're usually easy to spot: the technology is still maturing, and the images look artificial when examined closely.
But as AI capabilities improve and fake videos continue to spread, have we considered the full repercussions of videos that can depict specific individuals without their consent, or without their ever being filmed at all?
What Exactly Are Deepfakes?
Deepfakes are videos that use advanced technology to impose someone’s face on another person’s body. They became widely known in 2017, when porn videos surfaced showing women performing sex acts with celebrities’ faces superimposed onto their bodies. The term itself was coined by a Reddit user.
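To make that “advanced technology” a bit more concrete: the original deepfake method trains an autoencoder with one shared encoder and a separate decoder for each person’s face, then performs the swap by decoding one person’s face through the other person’s decoder. Below is a heavily simplified sketch of that idea in Python with PyTorch. Every name, layer size, and dimension here is an illustrative assumption, not code from any real deepfake tool.

```python
# Minimal sketch of the classic face-swap autoencoder idea: a shared
# encoder learns a common face representation, and one decoder is
# trained per identity. Swapping = encode person A, decode as person B.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),                # latent "face code"
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

shared_encoder = Encoder()
decoder_a = Decoder()  # would be trained only on person A's face crops
decoder_b = Decoder()  # would be trained only on person B's face crops

# After training, the "swap": encode a frame of person A,
# then reconstruct it through person B's decoder.
frame_of_a = torch.rand(1, 3, 64, 64)  # stand-in for a real face crop
swapped = decoder_b(shared_encoder(frame_of_a))
```

Because the encoder is shared, it learns pose and expression in a way both decoders understand, which is why the swapped face mimics the original performance.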
Since then, the technology has improved and has been used by comedians to poke fun at politicians and other famous people. The Fresh Prez of D.C. is a perfect example: in this hilarious YouTube show, Kyle Dunnigan mocks people on the Right and the Left, but in such a lighthearted, nonsensical way that everyone can enjoy his silly deepfakes.
Deepfakes are videos that use advanced technology to impose someone’s face on another person’s body.
It’s funny and entertaining, and it brings these kinds of videos further into the mainstream. When used for comedy, deepfakes seem like nothing more than a fun application of the technology.
The Destructive Power of Deepfakes
Unfortunately, deepfakes cause serious harm when they're misused to harass individuals, and most of the time, women are the targets. Rana Ayyub, an Indian journalist, has been very outspoken in her anti-establishment views. Then, in 2018, someone created a deepfake porn video to discredit her and her reporting.
She wrote an op-ed about her reaction and how this single deepfake changed how she approaches her career. A contact sent Ayyub a video that supposedly showed her performing pornographic acts. She said, “When I first opened it, I was shocked to see my face, but I could tell it wasn’t actually me because, for one, I have curly hair and the woman had straight hair. She also looked really young, not more than 17 or 18.”
“I started throwing up,” Ayyub said. “I just didn’t know what to do. In a country like India, I knew this was a big deal. I didn’t know how to react, I just started crying.”
The deepfake porn video went viral, with more than 40,000 shares. Ayyub’s Twitter, Facebook, and Instagram were all flooded with comments, screenshots, and shares of this video.
“It ended up on almost every phone in India,” Ayyub said. “It was devastating. I just couldn’t show my face. You can call yourself a journalist, you can call yourself a feminist, but in that moment, I just couldn’t see through the humiliation.”
Ayyub went on to explain how this horrible experience changed how she interacts online. “From the day the video was published, I have not been the same person. I used to be very opinionated, now I’m much more cautious about what I post online. I’ve self-censored quite a bit out of necessity...I’m constantly thinking what if someone does something to me again. I’m someone who is very outspoken so to go from that to this person has been a big change.”
Ayyub added, “I always thought no one could harm me or intimidate me, but this incident really affected me in a way that I would never have anticipated.”
90% of deepfake videos depict non-consensual porn featuring women.
Ayyub isn’t the only victim; deepfakes seem to be most commonly used as a tactic to silence women. The number of deepfakes in circulation has been doubling every six months, and 90% of the videos depict non-consensual porn featuring women.
Furthermore, what happens when deepfake tools become more user-friendly? Someone who wanted to implicate another person in a crime would find it far easier with accessible face-swap technology.
Deepfake Technology Is Ahead of the Law
Just this year, the mother of a high school cheerleader was facing harassment charges involving deepfakes until the DA’s office dropped the case. (The video was supposedly intended to discredit another girl on the cheer squad and get her kicked off.) Because deepfakes are still so new, and not easy to create, it should be easy to trace their creators. Surprisingly, it’s not: the investigation produced no evidence that the mother had the skills or the technology to create the video in question, or even that the video itself wasn’t real.
What’s most shocking about this specific case is that the local police were convinced she did it. A leading researcher and policy adviser on deepfakes, Henry Ajder, noted that there aren’t many people in the U.S. who can properly vet deepfakes. They would need to use “specific computational forensic techniques” and go “through it frame by frame to comb for clues to be able to say with authority if it is real or not.”
There aren’t many people in the U.S. who can properly vet deepfakes.
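To give a sense of what going “through it frame by frame” can mean in practice, here is a minimal sketch, assuming Python with OpenCV, that steps through a video one frame at a time and flags frames where the detected face region’s sharpness diverges from the rest of the image, since deepfake blending often leaves the pasted face at a different blur level than its surroundings. The filename and thresholds are illustrative assumptions, and this crude heuristic merely stands in for the far more sophisticated computational forensic techniques Ajder describes.

```python
# Minimal frame-by-frame inspection sketch using OpenCV. It compares
# the sharpness of each detected face region to the whole frame and
# flags large mismatches -- a rough heuristic, not proof of a fake.
import cv2

def sharpness(img):
    """Variance of the Laplacian: a common rough sharpness measure."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

# Stock face detector that ships with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("suspect_video.mp4")  # hypothetical input file
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break  # end of video
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        face = frame[y:y + h, x:x + w]
        ratio = sharpness(face) / max(sharpness(frame), 1e-6)
        # Flag frames where the face's blur level diverges strongly
        # from the rest of the frame (thresholds chosen arbitrarily).
        if ratio < 0.3 or ratio > 3.0:
            print(f"frame {frame_idx}: suspicious face region "
                  f"(sharpness ratio = {ratio:.2f})")
    frame_idx += 1
cap.release()
```

Even a toy check like this has to touch every frame, which hints at why authoritative vetting is so labor-intensive and why so few people are qualified to do it.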
Furthermore, laws regarding deepfakes are scarce. The 2021 National Defense Authorization Act requires the Department of Homeland Security to produce an annual report on deepfakes for the next five years, and it calls for the technology to be closely studied.
That’s not enough for some states. Virginia has banned deepfake porn, Texas has banned deepfakes intended to influence elections, and California has banned their production within 60 days of an election. It seems that most American lawmakers are mainly focused on protecting themselves from deepfakes right now.
China banned deepfakes in 2019, but questions are now being raised about whether banning the new technology would even be legal in a free country, where satire and video editing are such a big part of our culture.
Closing Thoughts
If individuals are weaponizing a new technology to abuse the rights of law-abiding citizens, new laws may need to be implemented. Whether or not we realize it, deepfakes are growing in popularity. They can be silly and simple or destructive and consequential. Either way, they reinforce the fact that we can’t believe everything we see on a screen.