The generative AI wave has brought with it a deluge of fantastical, sometimes freakish new imagery, along with something much darker: a growing volume of sexually explicit images of children created from innocent family photos.

Thanks to the widespread availability of so-called “nudifier” apps, AI-generated child sexual abuse material (CSAM) is exploding, and law enforcement is struggling to keep up.

Mike Prado, a deputy chief at the DHS ICE Cyber Crimes Unit, said he has seen cases where images of minors posted to social media were turned into CSAM with AI. Joe Scaramucci, a reserve deputy at the McLennan County Sheriff’s Office and director of law enforcement engagement at the anti-trafficking nonprofit Skull Games, said the use of AI to turn social media images into sexually explicit images of children was “exploding.”

“This is, unfortunately, one of the most significant shifts in technology that we’ve seen to facilitate the creation of CSAM in a generation,” he told Forbes.


Worse, Prado said predators have also taken photos of children on the street and modified them into illegal material. As Forbes reported last year, one man took images of children at Disney World and outside a school before turning them into CSAM.

“We see it occurring on a more frequent basis, and it’s growing exponentially,” Prado told Forbes. “It’s no longer on the horizon. It’s a reality that we are dealing with every day.”

Last year, a previously convicted sex offender was accused of taking a parent’s photos of their child from Facebook and posting them to a pedophile group chat on the encrypted messaging app Teleguard, claiming they were his stepchildren. There, a source familiar with the investigation told Forbes, other members of the group turned them into explicit sexual imagery. The man was subsequently arrested and charged with possessing thousands of pieces of CSAM, hundreds of them generated by AI.

The pace at which AI is evolving and the absence of guardrails around some models, as well as the sheer scale of the proliferation of AI-generated CSAM, have made policing substantially more difficult. “We have observed AI-generated CSAM created by generative AI platforms, including face-swapping apps, body-swapping apps, and ‘undress’ apps,” said Madison McMicken, a spokesperson for the Office of the Utah Attorney General. “Images are often pulled from social media and converted into CSAM.”

The people in those images, though, may forever be left in the dark. As one child exploitation investigator, who was not authorized to talk on record, told Forbes, “The irony of AI CSAM is that even the victim might not realize they’re a victim.”