Deepfake ‘Nudify’ Technology Is Getting Darker—and More Dangerous
In the shadowy corners of the internet, a chilling new frontier of digital abuse has emerged, powered by the relentless march of artificial intelligence. With just a few clicks, anyone can now transform a single photo into an eight-second explicit video, inserting women into hyper-realistic, graphic sexual scenarios. Welcome to the dark world of AI deepfake generators, where the line between reality and fabrication is vanishing—and the consequences are devastating.
One explicit deepfake website, which WIRED is not naming to avoid amplifying its reach, offers a menu of horrors that reads like a dystopian nightmare. For a small fee, users can generate videos where women are depicted undressing, engaging in explicit sexual acts, or being subjected to degrading scenarios like “fuck machine deepthroat” and various “semen” videos. Adding AI-generated audio costs extra, making the experience even more disturbingly lifelike. The site claims to require consent for uploaded photos, but it’s unclear if any real checks are in place to enforce this.
This isn’t just a niche problem. The deepfake ecosystem has exploded in recent years, fueled by advances in AI technology and the insatiable demand for nonconsensual pornography. From websites to Telegram bots and apps, the tools for creating explicit deepfakes are now more accessible than ever. And the harm they cause—particularly to women and girls—is profound.
“It’s no longer a very crude synthetic strip,” says Henry Ajder, a deepfake expert who has tracked this technology for over half a decade. “We’re talking about a much higher degree of realism of what’s actually generated, but also a much broader range of functionality.” Ajder estimates that these services are raking in millions of dollars per year, making them a lucrative—and deeply troubling—part of the AI revolution.
Over the past year, WIRED has documented how explicit deepfake services have rapidly expanded their capabilities. Image-to-video models now require just one photo to generate a short clip, and nearly all of the 50+ deepfake websites reviewed offer high-quality video generation. These sites often list dozens of sexual scenarios, from undressing to explicit acts, giving users unprecedented control over the content they create.
On Telegram, the problem is even more pervasive. Dozens of sexual deepfake channels and bots regularly release new features, such as different sexual poses and positions. In June 2023, one service promoted a “sex-mode,” encouraging users to “try different clothes, your favorite poses, age, and other settings.” Another promised “more styles” of images and videos, allowing users to “create exactly what you envision with your own descriptions” using custom prompts.
“It’s not just, ‘You want to undress someone.’ It’s like, ‘Here are all these different fantasy versions of it,’” says independent analyst Santiago Lakatos. “There’s versions where you can make someone [appear] pregnant.” Lakatos and the media outlet Indicator have documented how these “nudify” services often rely on big tech infrastructure, making them both scalable and profitable.
The scale of the problem is staggering. A WIRED review found over 1.4 million accounts signed up to 39 deepfake creation bots and channels on Telegram. After WIRED raised concerns, Telegram removed at least 32 of these tools, citing violations of its terms of service. “Nonconsensual pornography—including deepfakes and the tools used to create them—is strictly prohibited,” a Telegram spokesperson said, adding that the platform removed 44 million pieces of content last year that violated its policies.
But the damage is already done. The deepfake ecosystem is a societal scourge, one of the darkest parts of the AI revolution. As technology continues to advance, the line between reality and fabrication will only blur further, leaving victims of this digital abuse to grapple with the consequences.