How Deepfakes Cause “Embodied Harms”
During his preteen years in the 1970s, Spike Irons, now a porn actor and president of the adult content platform XChatFans, was “in love” with Farrah Fawcett. Though Fawcett did not pose nude, Irons managed to get his hands on what looked like pictures of her naked. “People were cutting out faces and pasting them on bodies,” Irons says. “Deepfakes, before AI, had been going around for quite a while. They just weren’t as prolific.”
People later used software like Adobe After Effects or FakeApp, a tool designed specifically to swap two individuals’ faces in images or videos. None of these programs required serious expertise, so the barrier to entry was low. That, plus the wealth of porn performers’ videos online, helped make face-swap deepfakes that used real bodies prevalent by the 2010s. When, later in the decade, deepfakes of Gal Gadot and Emma Watson caused something of a broader panic, their faces were allegedly swapped onto the bodies of the porn actors Pepper XO and Mary Moody, respectively.
But it wasn’t just high-profile actors like them whose bodies were being used. Jennifer was “a very minor performer,” she says. “If it happened to me, I feel like it could happen to anybody who’s shot porn.” Since he started his practice in 2006, Silverstein says, “numerous clients” have reached out to report “This is my body on so-and-so.”
Both people whose faces appear in NCII (non-consensual intimate imagery) deepfakes and those whose bodies are used this way can feel serious distress. Experts call this type of damage “embodied harms,” says Anne Craanen, who researches gender-based violence at the UK’s Institute for Strategic Dialogue, an organization that analyzes extremist content, disinformation, and online threats.
The term reflects the fact that even though the content exists in the virtual realm, it can cause real psychological and physical effects, including body dysmorphia: the face-swapped figure occupies the uncanny valley and can distort a person’s self-perception. After discovering their faces in sexual deepfakes, many people feel silenced, experts told me; they may “self-censor,” as Craanen puts it, and step back from public-facing life. Allison Mahoney, an attorney who works with abuse survivors, says that people whose faces appear in NCII can experience depression, anxiety, and suicidal ideation: “I’ve had multiple clients tell me that they don’t sleep at night, that they’re losing their hair.”
Though the impact on people whose bodies are used hasn’t been discussed or studied as often, Jennifer says that “it’s just a really terrible feeling, knowing that you are part of somebody else’s abuse.” She sees it as akin to “a new form of sexual violence.” Not knowing where your body might be appearing online can be deeply unsettling, and like Jennifer, many adult actors don’t really know what’s out there. But some devoted followers know the performers’ bodies well, often recognizing tattoos, scars, or birthmarks, and “very quickly they bring [deepfakes] to the adult performer’s attention,” says Silverstein. Or performers stumble upon the content by chance: some 20 years ago, for instance, the first client to tell Silverstein her body was being used in a deepfake had been searching for Nicole Kidman online when one of the results showed Kidman’s face on her own porn footage. “She was devastated, obviously, because they took her body,” he says, “and they were monetizing it.”
Otherwise, this imagery may be found by an organization like Takedown Piracy, one of several copyright enforcement companies serving adult content creators. US copyright violations can be challenging to prove if someone’s body lacks distinguishing features, says Reba Rocket, Takedown Piracy’s chief operating and marketing officer. But Rocket says her team has added digital fingerprinting technology to clients’ material to help flag and remove problematic videos, often finding them before clients realize they’re online.
By capturing “tens of thousands of tiny little visual data points” from a video, digital fingerprinting creates a unique corresponding file that can later be used to identify it, Rocket says, somewhat like an invisible watermark. The prints survive even if pirates re-edit the videos or replace performers’ faces. Takedown Piracy has digitally fingerprinted more than half a billion videos, and the organization has gotten 130 million copyrighted videos taken down from Google alone (though Rocket hasn’t tracked how many of those specifically put someone else’s face on a performer’s body).
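Rocket doesn’t describe the underlying algorithm, but frame-level perceptual hashing is one common way to build fingerprints that survive re-encoding, cropping, or a face swap that leaves the rest of the scene intact. The sketch below is a simplified illustration of that general idea, not Takedown Piracy’s actual technology; the dHash approach, the sample count, the file names, and the 0.9 match threshold are all assumptions chosen for demonstration.

```python
# Minimal perceptual-fingerprint sketch (assumption-based illustration,
# NOT Takedown Piracy's system). Requires: pip install opencv-python numpy
import cv2
import numpy as np


def dhash(gray_frame: np.ndarray, hash_size: int = 8) -> int:
    """Difference hash of one frame: compare adjacent pixel brightness."""
    small = cv2.resize(gray_frame, (hash_size + 1, hash_size))
    diff = small[:, 1:] > small[:, :-1]  # 8x8 grid of booleans
    return int("".join("1" if b else "0" for b in diff.flatten()), 2)


def fingerprint(path: str, samples: int = 32) -> list[int]:
    """Sample frames evenly across the video and hash each one."""
    cap = cv2.VideoCapture(path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    hashes = []
    for i in np.linspace(0, max(total - 1, 0), samples, dtype=int):
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(i))
        ok, frame = cap.read()
        if not ok:
            continue
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hashes.append(dhash(gray))
    cap.release()
    return hashes


def similarity(fp_a: list[int], fp_b: list[int]) -> float:
    """Fraction of matching bits across aligned sampled frames."""
    pairs = list(zip(fp_a, fp_b))
    if not pairs:
        return 0.0
    total_bits = 64 * len(pairs)
    differing = sum(bin(a ^ b).count("1") for a, b in pairs)
    return 1.0 - differing / total_bits


# Hypothetical usage: flag a suspected re-upload when most bits still match,
# even after re-encoding or a face swap that leaves the scene intact.
# if similarity(fingerprint("original.mp4"), fingerprint("suspect.mp4")) > 0.9:
#     print("probable match: queue for takedown review")
```

Because the hash is built from coarse brightness patterns across whole frames, replacing a performer’s face changes only a small fraction of the bits, which is why this family of techniques can still flag a video after a face swap.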
- The use of porn actors’ bodies in deepfakes can cause serious psychological distress.
- Generative AI has made such non-consensual imagery far easier to create, increasing the risk to performers.
- Copyright enforcement companies like Takedown Piracy are helping remove problematic videos but face challenges due to the lack of distinctive features on some performers’ bodies.
Originally published at technologyreview.com.




