Thousands Of Realistic But Fake AI Child Sex Images Found Online

According to The Washington Post, thousands of realistic but fake AI-generated child sex images have been found online.

Child safety experts are growing increasingly powerless to stop thousands of “AI-generated child sex images” from being easily and rapidly created, then shared across dark web pedophile forums, The Washington Post reported.

This “explosion” of “disturbingly” realistic images could help normalize child sexual exploitation, lure more children into harm’s way, and make it harder for law enforcement to find actual children being harmed, experts told the Post.

Finding victims depicted in child sexual abuse materials is already a “needle in a haystack problem,” Rebecca Portnoff, the director of data science at the nonprofit child-safety group Thorn, told the Post. Now, law enforcement investigations will be further delayed by the added effort of determining whether materials are real at all.

Harmful AI materials can also re-victimize anyone whose images of past abuse are used to train AI models to generate fake images.


“Children’s images, including the content of known victims, are being repurposed for this really evil output,” Portnoff said.

Normally, content of known victims can be blocked by child-safety tools that hash reported images and detect when they are reshared, allowing platforms to block the uploads. But that technology only detects previously reported images, not newly AI-generated ones. Both law enforcement and child-safety experts report that these AI images are increasingly being popularized on dark web pedophile forums, with many Internet users “wrongly” viewing the content as a legally gray alternative to trading illegal child sexual abuse materials (CSAM).
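The limits of that hash-matching approach explain why newly generated images slip through. The sketch below is purely illustrative and is not any platform's actual tooling: it uses a plain SHA-256 digest and a hypothetical KNOWN_HASHES blocklist to show the exact-match logic, whereas production systems such as PhotoDNA rely on perceptual hashes that tolerate minor edits. Either way, such a filter can only flag content that has already been reported, which is the gap described above.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of digests for previously reported images.
# Real systems use perceptual hashes shared via industry databases;
# a plain set of hex strings stands in for that here.
KNOWN_HASHES: set[str] = set()


def fingerprint(image_path: Path) -> str:
    """Return a SHA-256 digest of the raw file bytes."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def is_known_reported_image(image_path: Path) -> bool:
    """True if an upload exactly matches a previously reported image.

    An exact cryptographic hash only catches byte-identical re-uploads,
    so newly generated (or even re-encoded) images pass unflagged --
    the limitation the article describes.
    """
    return fingerprint(image_path) in KNOWN_HASHES


if __name__ == "__main__":
    upload = Path("upload.jpg")  # hypothetical incoming upload
    if upload.exists() and is_known_reported_image(upload):
        print("Blocked: matches a previously reported image.")
    else:
        print("Not matched: hash-based filters cannot flag new content.")
```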

“Roughly 80 percent of respondents” to a poll posted in a dark web forum with 3,000 members said that “they had used or intended to use AI tools to create child sexual abuse images,” ActiveFence, which builds trust and safety tools for online platforms and streaming sites, reported in May.

While some users creating AI images, and even some legal analysts, argue that the content may not be illegal because no real children are harmed, some United States Justice Department officials told the Post that AI images sexualizing minors still violate federal child-protection laws. There appears to be no precedent, however: officials could not cite a single prior case resulting in federal charges, the Post reported.

As authorities become more aware of the growing problem, the public is being warned to change online behaviors to prevent victimization. Earlier this month, the FBI issued an alert, “warning the public of malicious actors creating synthetic content (commonly referred to as ‘deepfakes’) by manipulating benign photographs or videos to target victims,” including reports of “minor children and non-consenting adults, whose photos or videos were altered into explicit content.”

These images aren’t just spreading on the dark web, either, but on “social media, public forums, or pornographic websites,” the FBI warned. The agency blamed recent technology advancements for the surge in malicious deepfakes because AI tools like Stable Diffusion, Midjourney, and DALL-E can be used to generate realistic images based on simple text prompts. These advancements are “continuously improving the quality, customizability, and accessibility of artificial intelligence (AI)-enabled content creation,” the FBI warned.

