Deepfake 'Nudify' Technology Is Getting Darker - and More Dangerous
Briefly

"Open the website of one explicit deepfake generator and you'll be presented with a menu of horrors. With just a couple of clicks, it offers you the ability to convert a single photo into an eight-second explicit videoclip, inserting women into realistic-looking graphic sexual situations. "Transform any photo into a nude version with our advanced AI technology," text on the website says."
"Grok, the chatbot created by Elon Musk's companies, has been used to created thousands of nonconsensual "undressing" or "nudify" bikini images-further industrializing and normalizing the process of digital sexual harassment. But it's only the most visible-and far from the most explicit. For years, a deepfake ecosystem, comprising dozens of websites, bots, and apps, has been growing, making it easier than ever before to automate image-based sexual abuse, including the creation of child sexual abuse material (CSAM)."
Explicit deepfake services let users convert a single photo into a realistic explicit video clip and offer numerous pornographic templates, including undressing, oral, and semen scenes. These services charge small fees per video, with additional fees for AI-generated audio, and some display consent warnings without any apparent enforcement. Chatbots and other tools have been used to mass-produce nonconsensual nudified images, normalizing digital sexual harassment. A broader deepfake ecosystem of websites, bots, and apps has been expanding for years, lowering the barriers to automating image-based sexual abuse and increasing risks, including the creation of CSAM.
Read at WIRED