“Completely horrible, dehumanizing, degrading”: a woman’s fight against deepfake porn


Watch the CBSN Originals documentary "Deepfakes and the Fog of Truth" in the video player above. It premieres on CBSN on Sunday, October 17 at 8 p.m., 11 p.m. and 2 a.m. ET.


When Australian law student Noelle Martin was 18, she did what everyone who grew up with the internet does: Googled herself. But rather than finding photos from her family vacation, she was shocked to find explicit photos of herself posted on porn sites. Martin, however, never took these photos. Her face had been retouched on the bodies of adult film actresses and posted online.

“I saw pictures of me having sex, pictures of me in solo positions where my face was doctored over the naked bodies of adult actresses,” she said.

Photos had been scraped from Martin's social media accounts and used to create realistic-looking graphic images. She wasn't sure who created the images or why she was being targeted, but the images continued to appear on fetish websites.

“I’m not a public figure, I’m not a celebrity. I’m literally just an ordinary person, literally someone from Perth, Western Australia,” Martin told CBSN Originals. “And I didn’t have any partner or ex-partner who could have done anything like that to me.”

Martin tried to contact police, private investigators, and government agencies, but since she didn’t know where the images came from, there was no way to hold their creators to account. Martin even attempted to contact the operators of the porn sites that hosted the explicit photos of her, but those efforts sometimes led to further abuse.

“Sometimes I would get a response and they would delete it, and then it would reappear two weeks later,” she said. “And then once a site operator said he would only remove the material if I sent him nude photos of myself within 24 hours.”

Noelle Martin has spent years fighting the spread of deepfake porn online. “It was completely horrible, dehumanizing, degrading, violating to just see yourself distorted and misused in this way,” she said.

CBS News


Martin’s efforts to scrub the internet of the non-consensual porn depicting her failed, and the abuse escalated. Martin says she received an anonymous email at work telling her that there were now videos of her having sex posted on porn sites. These videos were deepfakes – media manipulated with sophisticated digital tools to graft Martin’s face onto the body of an adult film actress.

“It was completely horrible, dehumanizing, degrading, violating to just see yourself distorted and hijacked in this way,” Martin said.

In recent years, advances in artificial intelligence have made visual-effects technology cheaper and more accessible, enabling the creation of deepfakes: videos in which one person can be replaced by someone else, closely reproducing their likeness and movements. While these innovations have democratized the creation of special effects, the technology has been widely used to target women.

According to a 2019 report by cybersecurity firm Deeptrace, 96% of all online deepfakes are porn, and the top five deepfake porn sites exclusively target women.

Adam Dodge, a lawyer and the founder of EndTAB, an organization that works to end technology-enabled abuse, says deepfakes have been used to target women ever since the technology was released to the general public.

“In early 2018, we learned about deepfakes by being exposed to celebrity deepfake pornography,” Dodge said. “So there wasn’t even an opportunity to look at it from a different angle than, ‘Hey, this could be used in film editing or to put someone in a Star Wars movie.’ It was: ‘No, this is a form of weaponized technology used to harm women and girls online.’”

According to Dodge, the legislative system has been slow to respond to the threat that deepfakes pose to women.

“In most states, non-consensual pornography is illegal,” he explained. “However, if you are creating non-consensual deepfake pornography, these laws will not apply because it is not the victim’s body that is depicted in the video. It is just their face. So the video will not meet the threshold to be prosecuted under this law.”

The owner of one of the most popular deepfake porn sites on the internet, who goes by the pseudonym Dom, claims that his site exclusively publishes non-consensual celebrity deepfake porn.

“I don’t feel bad for celebrities,” Dom said. “I think as a public figure they’re better equipped for what’s going on. I think they know that too, about people fantasizing about them.”

Dom, who created his site after Reddit banned his deepfake subreddit for breaking its involuntary pornography rules, says his site receives around 350,000 visitors per day. And while the site’s main page focuses on public figures, it also hosts a forum that serves as a marketplace where deepfake makers can connect with people requesting deepfakes of non-celebrities.

“It could all be happening behind the scenes and I wouldn’t know it or the moderators wouldn’t know it,” Dom said. “It’s hard to control everything.”

Dom says he feels bad for non-celebrities who fall victim to non-consensual deepfake porn, and says that despite struggling to police the forums, he will ban users if he finds out they are making videos of non-celebrities. But Dom considers the videos he posts clearly fake, so he has no qualms about keeping his site live.

“If users make sure people know it’s fake and that it’s clearly labeled as fake for entertainment purposes, as long as they don’t try to pass it off as the real thing, that’s basically where my threshold is,” he said.

But for victims of deepfake porn, acknowledging that the media has been manipulated does not diminish the impact.

“Ultimately, they are fetishized and sexualized without their consent,” Dodge said. “And it doesn’t matter whether the video or photo that is being distributed or shared is believed by the viewer; it’s extremely harmful because it doesn’t feel good to have it out there.”

Martin says being a victim of image-based abuse has changed the trajectory of her life.

“It robs you of opportunities, and it robs you of your career, your hopes and your dreams,” she said. “I was admitted as a lawyer, and that’s the only thing I’ve ever wanted to do. And particularly in this field, where everything is a question of name, image and reputation, it was extremely difficult to find a job.”

Since Martin first found the doctored images of herself on Google, she has become an advocate for victims of similar forms of non-consensual pornography. She says her outspokenness has made her an even bigger target of abuse.

“Because I was talking about it, the perpetrators decided to create fake videos of me,” Martin said. “You only stand to lose when you talk about something like that, because when you dare to speak about this kind of abuse, you expose yourself to more people seeing the very thing that you don’t want people to see.”

Despite attempts to silence her, Martin has remained an outspoken activist. She advocated for legal reforms that led Australia to criminalize image-based abuse in 2017. But those years have taken their toll.

“It was such an overwhelming experience,” Martin said. “And I actually think it caused me more pain than it gave me strength or resilience. It really almost destroyed me.”