
Could AI-Generated Porn Help Protect Children?

Now that generative AI models can produce photorealistic fake images of child sexual abuse, regulators and child safety advocates are worried that an already-abhorrent practice will spiral further out of control. But lost in this fear is an uncomfortable possibility: that AI-generated child pornography could actually benefit society in the long run by providing a less harmful alternative to the already-massive market for images of child sexual abuse.

The growing consensus among scientists is that pedophilia is biological in nature, and that keeping pedophilic urges at bay can be incredibly difficult. “What turns us on sexually, we don’t decide that—we discover that,” said psychiatrist Dr. Fred Berlin, director of the Johns Hopkins Sex and Gender Clinic and an expert on paraphilic disorders. “It’s not because [pedophiles have] chosen to have these kinds of urges or attractions. They’ve discovered through no fault of their own that this is the nature of what they’re afflicted with in terms of their own sexual makeup … We’re talking about not giving into a craving, a craving that is rooted in biology, not unlike somebody who’s having a craving for heroin.”

Ideally, psychiatrists would develop a way to cure consumers of child pornography of the urge to view it. But short of that, replacing the market for real imagery with simulated imagery may be a useful stopgap.

There is good reason to see AI-generated imagery as the latest negative development in the fight against child sexual abuse. Regulators and law enforcement already comb through an enormous number of images every day attempting to identify victims, according to a recent paper by the Stanford Internet Observatory and Thorn. As AI-generated images enter circulation, it becomes harder to discern which images involve real victims in need of help. And AI-generated images often rely on the likenesses of real people, including real children, as a starting point; if the images retain those likenesses, that is abuse of a different nature. (That said, AI does not inherently need to train on actual child pornography to produce a simulated version of it; it can instead combine training on adult pornography with training on the likenesses of children.)

Finding a practical method of discerning which images are real, which show real people placed in fake circumstances, and which are fake altogether is easier said than done. The Thorn report claims that within a year it will become significantly easier for AI to generate images that are essentially indistinguishable from real ones. But this may also be an area where AI helps solve a problem it has created. AI can be used to distinguish between different forms of content, thereby aiding law enforcement, according to Rebecca Portnoff, head of data science at Thorn. For example, regulators could require AI companies to embed watermarks in open-source generated image files, or law enforcement could use existing passive detection mechanisms to track the origin of image files.
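To make the watermarking idea concrete, here is a minimal sketch in Python, using the Pillow imaging library, of how a generator might stamp a provenance tag into a PNG file's metadata and how a screening tool might read it back. The tag name and model identifier are hypothetical, and metadata like this is trivially stripped by a simple re-save, which is why real provenance schemes rely on more robust signals such as signed manifests (for example, C2PA) or watermarks embedded in the pixels themselves; this sketch only illustrates the basic concept.

from PIL import Image
from PIL.PngImagePlugin import PngInfo

PROVENANCE_KEY = "ai-provenance"  # hypothetical tag name, not a standard

def tag_as_generated(src_path: str, dst_path: str, model_id: str) -> None:
    """Write a provenance marker into a PNG tEXt metadata chunk."""
    img = Image.open(src_path)
    meta = PngInfo()
    meta.add_text(PROVENANCE_KEY, model_id)
    img.save(dst_path, pnginfo=meta)

def read_provenance(path: str) -> str | None:
    """Return the provenance marker if present, else None."""
    img = Image.open(path)
    return getattr(img, "text", {}).get(PROVENANCE_KEY)

A pixel-level watermark would survive cropping and re-encoding far better than a metadata tag, but the trade-off is the same either way: detection works only if generators cooperate, which is why such proposals pair watermarking with regulatory requirements.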

As for the generated images themselves, not everyone agrees that satisfying pedophilic urges can stem them in the long run.

“Child porn pours gas on a fire,” said Anna Salter, a psychologist who specializes in the profiles of high-risk offenders. In Salter’s and other specialists’ view, continued exposure can reinforce existing attractions by legitimizing them, essentially whetting viewers’ appetites, as some offenders have themselves reported. And even without that outcome, many believe that viewing simulated immoral acts harms the viewer’s own moral character, and thus perhaps the moral fabric of society as well. From that perspective, any inappropriate viewing of children is an inherent evil, regardless of whether a specific child is harmed. On top of that, the potential normalization of such viewing can be considered a harm to all children.
