Pedophiles use AI to turn celebrities into children

The dark side of technology

On dark web forums, the Internet Watch Foundation (IWF) has detected images of famous actors and singers depicted as children being shared among pedophiles, with artificial intelligence often used to turn these images into sexual ones. AI has also been used to generate hundreds of images of real victims of child abuse.

The IWF is trying to alert the world to the growing danger of pedophiles using AI systems to create images from simple written prompts. It had already warned that, once powerful image-generation systems became widely available to the public, they could be abused to produce illegal images.

The IWF report describes how its researchers spent a month cataloguing AI images on a child abuse website on the dark web, finding 3,000 synthetic images that could be considered illegal under European Union law. Analysts warn of a new trend among pedophiles: taking a photo of a known child abuse victim and generating many more images of that victim with different sexual themes.

A single file contained 501 images of a real victim who was 9 years old when she was abused. The file also included a fine-tuned AI model that would allow other pedophiles to produce further images of the victim.

The IWF emphasizes that the images are extremely realistic and that an untrained eye cannot distinguish them from real photographs.

The researchers saw images of mostly female actors and singers who had been made to look like children using specialized software. Of the 11,108 AI images the IWF investigated, 2,978 depicted actual victims of child abuse, 1,372 depicted children aged 7-10, and 143 depicted children aged 3-6.

"Our worst nightmares have come true," the IWF said. "Earlier this year we warned that AI imagery would soon be indistinguishable from real images of children suffering abuse, and that we would start to see such images in far greater numbers. We are now past that point."

Although AI images do not directly harm children, the IWF stresses that they normalize pedophilic behavior and drain police resources into investigating children who do not exist. The IWF also found hundreds of images of two girls from a non-nude photo shoot at a modeling agency that had been processed with AI into scenes of sexual abuse.