The Unseen Victims of Deepfake Porn: Porn Actors Whose Bodies Are Stolen

Deepfake pornography is often discussed in terms of the victims whose faces are superimposed onto explicit content without consent. But there's another group of people who suffer silently: the adult performers whose bodies are used—either directly or as training data—to create these fakes. This Q&A explores this hidden aspect of nonconsensual intimate imagery, based on the experience of Jennifer, a former adult content creator now working as a psychotherapist.

What happened to Jennifer when she searched for her old porn videos using facial recognition?

In 2023, Jennifer, now 37 and a psychotherapist in New York City, ran her professional headshot through a facial recognition program to see whether it would find the porn videos she made in her early 20s, more than a decade earlier. The technology did locate some of her original content, but it also returned something she had never seen before: one of her old videos with another person's face pasted onto her body. At first she assumed it simply showed someone else. Then she recognized a distinctive background from a 2013 video and realized, "Somebody used me in a deepfake." The algorithm had matched her because the image still retained her features: cheekbones, brow, chin shape. She described the experience as feeling like she was wearing someone else's face as a mask.

Why are conversations about deepfake porn often incomplete?

Discussions about sexualized deepfakes, which fall under nonconsensual intimate imagery (NCII), typically focus on the victims whose faces appear in explicit scenes, most often celebrities, but increasingly ordinary women and young people. This has sparked alarm, fear, and some legislation. Yet these conversations largely ignore the bodies the faces are attached to. Jennifer points out, "There's never any discussion about 'Whose body is this?'" For years, the answer has been adult content creators. Deepfakes first gained notoriety in 2017, when a Reddit user named "deepfakes" swapped the faces of stars such as Scarlett Johansson onto porn actors' bodies. The performers' images and likenesses are used without consent, yet their rights and experiences are seldom addressed.

How has generative AI made the situation worse for porn actors?

With the improvement of generative AI and the proliferation of 'nudify' apps, the issue has become more complex and dangerous. Porn actors' bodies are no longer always taken directly from existing sexual content in an identifiable way. Instead, their videos and images are used as training data to inform how new AI-generated bodies look, move, and perform. This threatens the livelihood and rights of adult performers, as their work trains AI nudes that could eventually replace them. The technology allows people to wholly recreate performers' likenesses without consent, potentially destroying their careers and income.

What legal protections exist for adult performers whose bodies are used in deepfakes?

Attorney Corey Silverstein, who specializes in adult industry law, notes that the nonconsensual use of performers' bodies "happens all the time" in deepfakes. Legal protections, however, are limited. Most NCII legislation targets the person whose face is used, not the person whose body is appropriated. Some states have laws against deepfake pornography, but they often require the victim to be identifiable, which is harder to establish when only a performer's body appears. Federal proposals such as the SHIELD Act and the DEEPFAKES Accountability Act have been introduced but not fully enacted. Adult performers often have little recourse beyond copyright claims on their original content, which do not cover AI-generated versions.

What is the deeper psychological impact on victims like Jennifer?

Jennifer's reaction captures the eerie violation: "It's like I'm wearing somebody else's face like a mask." Even though her face was not the one used, the deepfake retained enough of her physical characteristics to be identified by facial recognition. The violation is not just reputational; it is a loss of bodily autonomy. Victims feel stripped of agency over their own image, even long after leaving the adult industry. For Jennifer, now a psychotherapist, the discovery brought back trauma from her past and fear that her professional life could be jeopardized by something she cannot control. The psychological toll includes anxiety, shame, and a constant sense of exposure.

How can society better address the hidden harm to adult performers?

To address the hidden harm, society must first acknowledge that the bodies used in deepfakes belong to real people—often adult content creators—who did not consent. This requires shifting the narrative from a singular focus on face-swapping to include body theft. Legislators should expand NCII laws to cover the nonconsensual use of a person's physical likeness, not just their face. Platforms must improve detection and removal of deepfakes that use real performers' bodies. Additionally, AI training data should be sourced ethically, with consent and compensation for adult performers. Only by recognizing the full spectrum of victims can we create meaningful protections.
