Vast pedophile network shut down in Europol's largest CSAM operation
Europol dismantled a significant dark web pedophile network, leading to numerous arrests and the protection of child victims.
Feds test whether existing laws can combat surge in fake AI child sex images
Federal efforts to prosecute AI-generated child exploitation cases face legal challenges that may slow progress against emerging risks.
The DOJ makes its first known arrest for AI-generated CSAM
AI-generated CSAM is illegal and punishable under the law.
AI Is Triggering a Child-Sex-Abuse Crisis
Generative AI is increasingly being used to create sexually explicit images of children, marking an emerging crisis in child safety.
Texan man gets 30 years in prison for running CSAM exchange
Robert Shouse was sentenced to 30 years in prison for running a dark web forum for child sexual abuse material and for personally abusing children.
Europol Dismantles Kidflix With 72,000 CSAM Videos Seized in Major Operation
A major law enforcement operation has dismantled the Kidflix streaming platform, which distributed child sexual abuse material and counted 1.8 million users globally.
An AI Image Generator's Exposed Database Reveals What People Really Used It For
AI-generated adult imagery can include real photographs and face-swapping techniques, raising ethical concerns about consent and moderation.
We're unprepared for the threat GenAI on Instagram, Facebook, and WhatsApp poses to kids
Social media platforms are inundated with AI-generated child sexual abuse material (AIG-CSAM), creating challenges for law enforcement and anti-CSAM institutions.
UK's internet watchdog puts storage and file-sharing services on watch over CSAM
Ofcom has launched an enforcement program to tackle child sexual abuse material on file-sharing services under the U.K.'s Online Safety Act.
X fails to avoid Australia child safety fine by arguing Twitter doesn't exist
X Corp's lack of transparency on CSAM led to civil penalties and potentially costly repercussions under Australian law.
Europol arrests 25 users of online network accused of sharing AI CSAM
South Korea has criminalized AI-generated deepfake porn amid worldwide concern over child sexual abuse material (CSAM) laws, while Europol's Operation Cumberland underscores the lack of clear legislation across countries to address AI-generated CSAM.
Telegram U-turns and joins child safety scheme
Telegram's recent partnership with the Internet Watch Foundation marks a significant pivot toward addressing child sexual abuse material.
Nonprofit scrubs illegal content from controversial AI training dataset
The LAION-5B dataset has been re-released as Re-LAION-5B, now cleaned of links to child sexual abuse material (CSAM).
Amazon, Google and verification vendors among ad tech cohort under fire from U.S. senators over child safety shortcomings
Adalytics' recent report reveals that major ad tech companies are serving ads on websites containing child sexual abuse material (CSAM), raising serious concerns among lawmakers.
Tennessee, Connecticut Lawmakers Demand Accountability Over Ads On CSAM-Hosting Websites
U.S. senators are seeking accountability from digital ad platforms over ads served on websites hosting child sexual abuse material, highlighting critical oversight failures.
Under new law, cops bust famous cartoonist for AI-generated child sex abuse images
A new California law bans the possession or distribution of AI-generated child sex abuse material (CSAM), reflecting concerns over its inherent dangers to children.
Child predators are using AI to create sexual images of their favorite 'stars': 'My body will never be mine again'
Predators on the dark web are increasingly using AI to create sexually explicit images of children, particularly fixating on 'star victims'.
Amazon, Google accused of monetizing illegal content
US senators called out Amazon and Google for enabling ads on CSAM sites, questioning the effectiveness of their technology.
Checkmarked X Users Caught Promoting Sites That Sell Child Sex Abuse Videos
X's verification system has proven ineffective at preventing the promotion of child sexual abuse material.
Apple sued for failing to implement tools that would detect CSAM in iCloud
Apple is being sued for failing to implement iCloud scanning for child sexual abuse material, allegedly leading to harm for victims.
Snap calls New Mexico's child safety complaint a 'sensationalist lawsuit'
Snap argues that New Mexico's attorney general manipulated evidence to portray its app negatively and emphasizes its compliance with CSAM reporting laws, while state officials maintain serious concerns about children's safety.
The world's leading AI companies pledge to protect the safety of children online
Leading AI companies have pledged to prevent their generative AI tools from being used to create CSAM, aiming to protect children from abuse.
Apple sued over abandoning CSAM detection for iCloud
Apple is being sued for not implementing a system to detect child sexual abuse material in iCloud, a decision plaintiffs say has compounded victims' trauma.
Apple hit with $1.2B lawsuit after killing controversial CSAM-detecting tool
Apple faces a $1.2 billion lawsuit from survivors of sexual abuse over its failure to effectively address child sexual abuse material on its platforms.
Bluesky ramps up content moderation as millions join the platform
Bluesky is expanding its content moderation team to address a rise in concerning user content amid rapid growth.
AI-generated child sexual abuse imagery reaching 'tipping point', says watchdog
AI-generated child sexual abuse imagery is increasingly prevalent online, with reports rising significantly in the past six months.
Telegram's Durov must remain in France and post a €5M bail
Pavel Durov, founder of Telegram, faces serious criminal charges in France, including money laundering and CSAM distribution.
Was an AI Image Generator Taken Down for Making Child Porn?
AI companies face scrutiny for enabling tools that facilitate the creation of child sexual abuse material, raising ethical and legal concerns.