‘AI-generated images of child sexual abuse prolong victims’ suffering’

Two years ago, when offenders began using artificial intelligence (AI) to generate images of children being sexually abused, the images found by the Internet Watch Foundation (IWF) were still crude, though no less disturbing in their cruelty than traditional images.

Since then, the threat has grown considerably. Thanks to rapid advances in technology, the most convincing AI-generated content can now be visually indistinguishable from real images and videos. The situation is all the more alarming because such images can be produced at an astonishing scale and speed: a single prompt can generate at least 50 images, each taking barely 20 seconds to create. Furthermore, this technology is widely accessible, allowing almost anyone to secretly create high-quality, deeply disturbing content.

In some cases, existing images depicting children being sexually abused have been used to train AI models, embedding real trauma into artificial content. This allows predators to manipulate images of actual abuse to suit their fantasies and preferences. As a result, the victims’ suffering is prolonged, as these new versions can be shared endlessly.

New avenues for predators

Research has shown that there is a real and undeniable link between viewing child sexual abuse material and committing offenses. A study conducted among dark web users found that 40% of offenders reported that they would seek to make contact with a real child after having viewed an image of some form of sexual violence.


By investigating a dark web forum, IWF analysts discovered that more than half of the AI-generated child sexual abuse images depicted children of primary school age (7 to 10 years old); 143 images showed children aged 3 to 6 years old; and two depicted babies. Over 20% of these images were classified as category A under UK law – the most severe, involving rape, torture or bestiality.

AI is also opening up new avenues for predators to approach children and to extort them. IWF analysts found a "manual" for sexual extortion explaining how to coerce children into producing images, suggesting the use of AI to fake visuals and entrap victims. In some cases, children themselves risk becoming offenders by using AI tools to create content that can "undress" their peers.
