Prosecutor: AI-generated images of children being sexually abused found in Boone County case

Law enforcement concerned about 'growing' problem
Jason Chambers leaves the Boone County Justice Center on Oct. 18 after pleading guilty to possessing images of naked children being sexually abused

BURLINGTON, Ky. — Jason Chambers stored hundreds of images of naked children being sexually abused on USB drives he kept in his apartment in Burlington, Ky., according to the Boone County Commonwealth's Attorney's Office.

Chambers pleaded guilty on Oct. 18 to 13 counts of Possession of Matter Portraying a Sexual Performance by a Minor, a charge often described as child pornography.

Jason Chambers pleaded guilty last week in Boone County Circuit Court to possessing images of children being sexually abused

Most of those criminal counts involved victims under the age of 12, according to Chambers' indictment.

According to Boone County Assistant Commonwealth's Attorney Sierra Merida, the National Center for Missing & Exploited Children (NCMEC) identified nine victims in Chambers' collection of images and videos.

Chambers also had "hundreds of images of anime and AI images and videos depicting the sexual abuse of children," according to a court filing by Merida.

But Chambers wasn't charged with possessing anime and AI-generated images of naked children being sexually abused.

"These materials are not illegal," Chambers' defense attorney Heather Herald wrote in a court filing opposing the prosecution's motion to have the AI images and anime introduced into evidence. "AI-generated images and anime are inherently more severe. AI and animation can create scenarios that are extreme and well beyond anything that happen in reality."

Boone County Circuit Judge Richard Brueggemann admitted some of the anime, AI-generated images and "written child erotica stories" as evidence, according to the judge's order.

Six weeks after the judge issued that order, Chambers pleaded guilty.

"We've got to start asking more serious questions about this," Ohio Attorney General Dave Yost told the I-Team during a recent interview about AI-generated images of Child Sexual Abuse Material (CSAM). "Those are feeding the most perverse, twisted thoughts, desires and fantasies out there."

Ohio Attorney General Dave Yost

In September, the National Association of Attorneys General sent a letter to leaders in Congress urging them to establish an expert commission to examine the issue and propose solutions.

"We are concerned that AI is creating a new frontier for abuse that makes such prosecution more difficult," according to the group's letter. "The technology can be used to exploit children in innumerable ways."

According to the AGs' letter, so-called 'deepfakes' can be created showing the faces of otherwise unvictimized children edited into sexually explicit material.

"AI has made it quick and easy," according to the group's letter.

The Attorneys General urged Congress to expand existing restrictions on CSAM to explicitly cover AI-generated CSAM.

"We are engaged in a race against time to protect the children of our country," according to the letter. "Now, is the time to act."

In July, the Internet Watch Foundation (IWF) — a nonprofit based in the United Kingdom — reported finding AI-generated images that were "astoundingly realistic."

IWF is the UK organization responsible for "finding and removing" CSAM from the internet, according to the group's website.

Dan Sexton, IWF's Chief Technology Officer, told the I-Team the problem is getting worse.

"You can create whatever you want," Sexton said.

Sexton said IWF has found online chat groups providing guidance on how to use AI to create and share more extreme sexually explicit images of children.

"They're using AI generation to take a victim's likeness and train models on that victim to create more imagery of the abuse," Sexton said.

Internet Watch Foundation Chief Technology Officer Dan Sexton

According to news reports, children in Spain used AI to create sexually explicit images of classmates as young as 12.

Then, they shared them on private messaging services.

"The tech industry, governments and regulators have to be creating safer products," Sexton said. "Safe spaces, and insuring there are good laws and regulation to ensure that there are minimum standards across all platforms."

Under U.S. federal law, Electronic Service Providers (ESPs) are required to report apparent child pornography on their systems to NCMEC’s CyberTipline, according to the NCMEC website.

Millions of CSAM images and videos have 'hashes,' or digital fingerprints, that enable investigators and internet service companies to identify illegal images of children.
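
Hash matching works roughly like the sketch below: a file's contents are reduced to a fixed-length fingerprint, which is then compared against a list of fingerprints of known illegal images. This Python sketch uses a plain SHA-256 digest and a made-up hash list purely for illustration; production systems such as Microsoft's PhotoDNA use perceptual hashes that still match after an image is resized or re-encoded.

    import hashlib
    from pathlib import Path

    # Hypothetical list of fingerprints of known illegal images, of the
    # kind clearinghouses distribute to internet companies. (Real systems
    # use perceptual hashes such as PhotoDNA, not plain SHA-256.)
    KNOWN_HASHES = {
        # placeholder entry: the SHA-256 digest of an empty file
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def fingerprint(path: Path) -> str:
        """Compute the SHA-256 digest of a file, reading it in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def is_known_image(path: Path) -> bool:
        """Flag a file whose fingerprint appears on the known-hash list."""
        return fingerprint(path) in KNOWN_HASHES

A brand-new AI-generated image produces a fingerprint that appears on no existing list, which helps explain why investigators say the technology is making detection and prosecution harder.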

"To date, over 1,400 companies are registered to make reports to NCMEC’s CyberTipline," according to the NCMEC website. "These companies also receive notices from NCMEC about suspected CSAM on their servers."

Since 2002, NCMEC's Child Victim Identification Program has identified more than 19,100 children in child sexual abuse material, according to the website.

Possession and distribution of AI-generated CSAM is a crime in some countries.

In South Korea and Canada, men were recently sentenced to prison for using AI to create exploitative images of children.

"The use of deepfake technology in criminal hands is chilling," Provincial court judge Benoit Gagnon wrote in a ruling on the Canadian case. "The type of software allows crimes to be committed that could involve virtually every child in our communities."

In the UK, online providers face possible fines and criminal charges if they fail to comply with new online safety rules designed to reduce and prevent child sexual exploitation.

Yost told the I-Team he believes companies that use AI to create images should be required to have a permanent watermark on material created with their tools.

He also said lawmakers need to address the issue as soon as possible.

"This is going to lead to a new generation of child abuse like we haven't seen before and we already see way too much," Yost said.

Congressional response

On Tuesday, Oct. 17, the I-Team emailed the press offices of every U.S. Senator and U.S. Representative serving communities in the Tri-State.

In our email, we included a copy of the AGs' letter to Congressional leaders and asked each office to respond to the letter and to identify specific actions the Senator or Representative believes Congress should take to address the issue.

We told them our deadline was last Friday at 5 p.m.

Only three met our deadline: Rep. Greg Landsman (D-Ohio), Sen. Sherrod Brown (D-Ohio) and Sen. Todd Young (R-Ind.).

Rep. Landsman was the only one who pledged to support any of the attorneys general's specific recommendations.

"He’s 100 percent supportive of the proposed commission," Landsman's Communications Director Alexa Helwig wrote in an email response to the I-Team. "Greg believes with expertise from several partners, we can get ahead of those who want to harm children in some of the most heinous ways possible."

In his statement, Sen. Brown wrote, "As technology develops, Congress needs to ensure we are putting safeguards in place that prevent adults from using new technology to harm children."

Sen. Young supported working with the AGs on the issue.

"Senator Young believes this letter raises a very important issue that Congress should carefully consider as any federal legislation on AI is drafted and that Congress should collaborate with state attorneys general on this issue," Young's Communications Director Matt Lahr wrote in a response to the I-Team.

Yost, president of the National Association of Attorneys General, said lawmakers should consider regulations that would make it much easier for law enforcement to identify sources of AI-generated CSAM.

"You need to have a non-removable watermark that says where this stuff was created and by what product," Yost said.

Yost said he planned to contact state lawmakers about the threat posed by CSAM that, according to experts, can now be created within minutes using AI.

“Instantly, there are copies floating around the internet multiplying like micro-organisms in the night," Yost said. "It's impossible to stop."

The National Center for Missing & Exploited Children’s Call Center is open 24 hours a day, 7 days a week.
1-800-THE-LOST (1-800-843-5678)

You can report the sexual exploitation of children by using NCMEC's CyberTipline.
https://report.cybertip.org/