As text-to-image generators become easy to build, use and customize, AI-generated porn communities are burgeoning on Reddit. But these synthetic sexual photos are built on non-consensual images of real people, experts say.

By Rashi Shrivastava, Forbes Staff


Reddit has long been a storehouse for porn, which it allows on its platform as long as it is consensual. Now, as text-to-image AI tools catch on, an avalanche of AI-generated pornographic images has descended upon the platform, accessible by its 57 million active users. While these images may be fake, the models used to create them are trained on photos of real people and can render facsimiles of them — mostly without their consent — experts told Forbes.

“The fundamental issue of generative AI porn is that it relies on the mass-scale theft of people’s, and disproportionately women’s, images and likeness,” said Noelle Martin, a survivor of sexual deepfakes and a legal researcher at University of Western Australia’s tech and policy lab.

Forbes found 14 subreddits dedicated to AI-generated porn, 10 of which had been created within the past two months. Each has thousands of members and thousands of visits each week. One of the most popular AI porn subreddits, r/unstable_diffusion, has been live since September 2022 and has 54,000 members; another has more than 60,000. Reddit offers a “safe search” toggle that hides adult content in regular search results, but with it turned off, a search for “AI porn” returns more than two dozen forums hosting and sharing AI-generated adult material. Within these subreddits, some users request or sell AI-generated sexual deepfakes of real people.


“In creating new content with my image to distribute and train the AI systems, it basically means that I as a person have zero control or ability to stop it.”

A victim of sex trafficking

As text-to-image AI tools become more sophisticated and their outputs more realistic, it’s becoming difficult for users to differentiate not only between what’s real and what’s not, but also between what is consensual and what is not. In early April, some Redditors were fooled into paying $100 for photos of a supposed 19-year-old woman named Claudia, whose images were generated using the buzzy AI tool Stable Diffusion, according to Rolling Stone. Experts say that text-to-image systems of the type that contrive AI porn are trained on millions of images of real people, mostly without their knowledge or consent, as well as images owned by other entities. Artists and stock-image platforms like Getty have recently sued AI text-to-image generators like Stability AI and Midjourney for copyright infringement.

One victim of a sex trafficking case, in which she and other women were coerced into having sex on camera, told Forbes that she believes the use of her images or videos for training AI systems would be a violation of her privacy. The victim, who spoke anonymously with Forbes out of concern that videos of her would resurface, says that sexually explicit videos of her were uploaded to Reddit and other sites without her consent and were later taken down.

“For people to still have copies of this is already a huge infringement on my legal rights,” she says. “So for them to be sharing these copies publicly, and then creating new content with my image to distribute and train the AI systems, it basically means that I as a person have zero control or ability to stop it.”

AI-generated porn has also popped up on other platforms like Twitter and Instagram. But it has been predominantly flourishing on Reddit, in part because the platform allows users to remain anonymous and does not prohibit them from sharing sexually explicit content. The platform has itself also been used to train large language models, and is now demanding payment: Reddit’s CEO Steve Huffman said in a recent interview with the New York Times that he wants Google, Microsoft and OpenAI to pay up for using Reddit’s data and conversations to train their algorithms. “The Reddit corpus of data is really valuable. But we don’t need to give all of that value to some of the largest companies in the world for free,” Huffman told the New York Times. Reddit recently announced that it will close its data API, which had been open since 2008, and plans to charge larger third parties for “premium access” to it.

With backing from heavyweights like Andreessen Horowitz and Sequoia Capital, Reddit was last valued at $10 billion after it raised $700 million in a Series F round in August 2021. With $1.9 billion in total funding, it is also reportedly planning to go public in 2023. Reddit booked $470 million in revenue in 2021 primarily through ad sales and premium subscriptions, according to estimates by Statista.

“I feel like with Reddit, you can kind of push the envelope a little bit with what’s acceptable in a way that you can’t on other platforms.”

Sophie Maddocks, a researcher on cyber sexual violence

While Reddit claims to not display ads against or monetize mature content on its site, experts say that allowing mature content on the site, artificial or real, draws more users to the platform and drives engagement. “Even though they don’t advertise on the subreddits, they still benefit from the fact that these threads draw a lot of people,” says Sophie Maddocks, a researcher on cyber sexual violence.

Users who come to the platform to consume porn also visit other pages and subreddits that do not contain NSFW content, says Jamie Cohen, a professor of social media and advertising aesthetics at CUNY Queens College. “Reddit actually tracks page visits and subreddit visits and then sends personalized data which offers new avenues for advertisers to use,” he says.


SCRAPING THE INTERNET FOR “AI GIRLS”

While Reddit is the largest social media platform that hosts AI porn for its tens of millions of users, a crop of dedicated AI porn sites has set up shop to monetize it.

Most AI porn generators like PornJourney, PornPen, PornJoy and SoulGen charge premium subscriptions and are connected to shadow Discord or Patreon accounts managed by anonymous users. Their websites display an array of AI-generated porn with different options for ethnicities and body types and instructions on how to create them — all with little to no disclosure of how the systems were trained and on what imagery. To that end, the sites also include disclaimers about content generated through their tools: “Any generations of content on this website that resemble real people are purely coincidental. This AI mirrors biases and misconceptions that are present in its training data.”

One such AI image generator, PornJourney, launched in March 2023 and charges users $15 per month to create “AI girls” who look “real and human-like,” according to its website.

“Providing authentic and detailed photos of AI girls is costly, forcing us to invest continuously in our servers,” the site’s FAQ explains.

PornPen, which has 2 million monthly users and 12,000 users paying $15 per month for its AI porn generation tool, is built on Stable Diffusion’s AI model and sources images from a dataset called “LAION,” which contains roughly 6 billion images sourced from publicly available content on the web. The massive dataset includes images of celebrities, models, politicians and content creators. LAION, the nonprofit that maintains the dataset, says on its website: “Our recommendation is to use the dataset for research purposes.”

Tori Rousay, an advocacy manager at the National Center on Sexual Exploitation (NCOSE), says that most AI porn text-to-image generators like Unstable Diffusion and PornPen use open source models from GitHub or Hugging Face to scrape images from porn websites and social media profiles and build a database of sexually explicit images. Forbes found at least five GitHub repositories for web crawlers that can be used to scrape images and videos from websites like PornHub and Xvideos, as well as social media sites like Twitter, Instagram and TikTok, to build AI systems. PornHub and Xvideos did not respond to requests for comment.

“Unstable Diffusion is like the cousin of Stable Diffusion. So what they did is they took their code, they replicated it, and they made their own repository based off of just pornographic and nude images,” says Rousay, who has researched how AI is used to make pornography. Female celebrities, politicians and journalists are most likely to be victims of AI porn because a vast amount of their visual content is available online, Rousay says. These individuals are also most targeted in the creation of deepfakes.


REDDIT’S GRAY ZONE

Reddit has tried to clarify its stance on this emerging area of sexually explicit content by banning non-consensual sexually explicit content including “depictions that have been faked,” referring to AI deepfakes — when an algorithm generates fake images of a real person. The site also prohibits “AI-generated material if it’s presented in a deceptive context,” according to a Reddit spokesperson. Reddit says that it uses a combination of automated tools and its internal safety team to detect and remove non-consensual explicit content from the site. But it doesn’t catch everything.

“Reddit is also known for having those blurred lines between what is morally questionable and what is banned from Reddit. So, AI-generated pornography kind of falls within Reddit’s gray zone,” Rousay says.

Deepfakes, non-consensual sexually explicit content, and generative AI porn all exist in the same pool of pornographic images that get recycled into new AI images. But it’s important to note the difference between generative AI porn, which depicts an entirely new unidentifiable person, and sexual deepfakes, where people’s faces are digitally stitched onto other people’s bodies. Researcher and deepfake survivor Martin says she faced the implications of this 10 years ago, long before generative AI existed, when someone photoshopped her selfies to fabricate pornographic images and videos of her. With technological advances in recent years, it’s getting harder to tell sexual deepfakes apart from AI porn, she says.

“Reddit is known for having those blurred lines between what is morally questionable and what is banned from Reddit.”

Tori Rousay, an advocacy manager at the National Center on Sexual Exploitation

Nicola Henry, who has been studying tech-facilitated sexual violence and image-based sexual abuse for 20 years at the Royal Melbourne Institute of Technology, says that compared to sexual deepfakes, AI porn may seem more innocuous at first glance. But upon a closer look, some images show traces that the AI may be trained on underage images, she says. “I’ve seen some of the AI images on Reddit and the faces don’t look like women but underage girls so that’s concerning because if they were real, those images could qualify as child sexual abuse content.”

AI porn maker Unstable Diffusion has attempted to sidestep this problem by prohibiting users from using its tool to create CSAM and by trying to source consensual images, asking people to provide them in exchange for early access and free membership to its AI porn generator. It now claims to have collected 15 million “user-donated” images for its training database.

Maddocks says that even if these tools are used for sexual expression and exploration, they foster unrealistic and unhealthy expectations around body image and sexual relations simply because of the corpus of data they feed on. Browsing through generative AI erotica on Reddit and Twitter, she says that these renderings reflect sexual stereotypes and racial biases found in mainstream media. “Because these tools are learning from the images that are already out there, or the text that’s already out there, when you ask them to produce a lot of imagery of marginalized sexual groups or of queer or trans people of color, they often come up short,” she says.

AI systems that are remixing content from sites like PornHub and Xvideos further pose a problem for adult actors who own the rights to their content, Maddocks says. Adult film actors and platforms hosting their work make money from subscriptions and advertising on their content, she says. But as of yet, they’re not being paid if their images are used to generate porn. Artists and platforms that have sustained financial losses from generative AI have taken action against AI tools like Stable Diffusion for using images without payment. Generative AI porn might spark a similar outcry in the adult film industry, Maddocks says.

Reddit has historically accommodated several other forms of sexually explicit and sexually abusive content within its communities. Years after controversial subreddits like r/jailbait and r/creepshots were removed, sexually explicit images from banned communities still linger on the platform. “I feel like with Reddit, you can kind of push the envelope a little bit with what’s acceptable in a way that you can’t on other platforms,” Maddocks says. “You kind of are completely under the radar.”
