Young children exposed to online sexual exploitation attempts as criminal gangs produce 'deep fake nudes' from innocent photos
Briefly

"Children as young as five are being exposed to attempts at online sexual exploitation using sophisticated technology, including 'nudify' apps that create explicit images from innocent photos, gardaí have revealed."
"While garda advice in relation to sexual imagery has been to educate children and teenagers not to take and share intimate images of themselves or others, technology allows innocent photos to be manipulated to create a realistic-looking AI image of a person naked and engaging in sexual behaviour."
Sophisticated AI tools are being used to produce realistic explicit content from ordinary photographs. Gardaí report that children as young as five are being targeted with attempts at online sexual exploitation using tools such as 'nudify' apps that transform innocent images into explicit ones. Existing safety advice has focused on educating children and teenagers not to take or share intimate images of themselves or others, but image-manipulation tools now enable the creation of realistic-looking AI images depicting a person naked and engaging in sexual behaviour, undermining prevention strategies that rely solely on individual restraint. The realism and reach of these tools increase the risk of harm to vulnerable young people.
Read at Independent