CINCINNATI — A 25-year-old Florence man is facing 11 charges for allegedly posing as a teenage boy on Snapchat to lure teen girls into sending him nude photos of themselves, but he's not the only Tri-State resident accused of using popular social media apps to sexually exploit minors.
Brady McMillan was arrested and indicted in February following a months-long investigation led by the Boone County Sheriff's Office.
McMillan engaged in several graphic conversations with at least three teen girls in the spring of 2022, according to court documents; another alleged victim was just 12 years old. His next court appearance is scheduled for Nov. 8.
In Independence, investigators allege a 19-year-old solicited sexually explicit photos of a 15-year-old girl, also over Snapchat, before harassing and intimidating the child and sending her photos to other people.
Tyler Sizemore was also arrested and indicted in February and is facing 24 counts, including promoting a sexual performance by a minor, use of an electronic system to induce or procure a minor to commit a sexual offense, possession of a matter portraying a sexual performance by a minor and terroristic threatening.
In the criminal complaint filed at the time of Sizemore's arrest, investigators suggested he could have victimized other teen girls.
"They can now do these to like hundreds of kids a month if they wanted to — if they have the dedication to — and a lot of them do have the dedication to do that because it's so — there's so many people on the internet and if you just go on these social sites, you can find so many children," said Ani Chaglasion, who knows all too well the dark reality of the internet.
She hadn't even started high school when she said she began an online relationship with a man through the chat site Omegle. That began a nearly four-year ordeal of grooming and sexual exploitation, she said.
"It wasn't until my junior year of high school when I found out that he had been doing this to a multitude of other victims and that he had actually, you know, sent my content out to other minors," she said.
Chaglasion said though she no longer communicates with her abuser, she will forever be a victim.
"I didn't consent to any of this happening," she said. "A lot of the content, I wasn't even aware that it was being recorded."
Once that first image was captured, Chaglasion said, she was added to a widening web of child sexual abuse material. Commonly and legally referred to as child pornography, CSAM is multiplying by the hundreds of images every second.
According to the National Center for Missing & Exploited Children, there were more than 32 million reports of suspected child sexual exploitation in 2022, up from 29.3 million reports in 2021.
While NCMEC's CyberTipline receives reports about multiple forms of online child sexual exploitation, reports of CSAM make up the largest category. More than 99.5% of the reports the CyberTipline received last year involved incidents of suspected CSAM.
"You know, it used to be difficult to come across this type of content," NCMEC child advocate Callahan Walsh said.
Not only is CSAM a growing threat, but it's also evolving.
Walsh said in the past, CSAM would only be found on the dark web, but now online predators are finding more accessible ways to create and share the images.
Now it's hiding in plain sight, no longer confined to the internet's dark corners but out on the surface.
"We're seeing it right there on the social media platforms that we all use daily. And these social media platforms don't want this content on their platform," he said. "They simply want it off the platform. They don't want their platforms to be used to exploit children."
The social media apps most commonly flagged in reported CSAM cases include Facebook, Instagram, Reddit and Discord, but as Chaglasion's case and the pending cases in Boone and Kenton counties suggest, online predators are casting their webs ever wider.
According to the U.S. Attorney's Office for the Southern District of Ohio, a 62-year-old Trenton man was charged in April with distributing child pornography after allegedly using an encrypted instant messaging application to seek out adults with access to minor children for the purpose of sharing sexually explicit images and engaging in sexual acts with children.
NCMEC is also seeing an increase in "self-produced content."
"Where a child will be coerced, will be manipulated into thinking that they're sending this nude image to appear to perhaps a new romantic, you know, relationship that they're in, that they've made online," he said. "And so children, uh, reluctantly are oftentimes coerced by many times adults posing as children.
Walsh said predators can also take online sexual exploitation further.
"Sextortion — a word we didn't even have a few years ago, where an individual will get their hands on a sexually explicit image of a child and many times that's, you know, grooming them and building that relationship and manipulating that child until they self produce that image," Walsh said. "But once that predator has that image, they then use that to blackmail the child for either more sexually explicit content or for monetary goods and services."
There's also the threat of online predators accelerating from grooming and sextorting young victims to luring them into physical sexual encounters.
Jason Thomas Gmoser, 43, of Hamilton was sentenced to 30 years in federal prison earlier this year for producing child pornography. Prosecutors said he used a webcam while playing PlayStation games online to film and record sexually explicit videos of himself and minor males.
Gmoser traveled outside of Ohio to the 8-year-old victim's home on multiple occasions, according to the U.S. Attorney's Office for the Southern District of Ohio. Gmoser also took the boy to the movies, out to eat, and purchased items for him and his family, including a PlayStation.
In another case, a Michigan man is accused of driving down to the Cincinnati area to have sex with a minor girl after soliciting her for sexually explicit content.
Jamari Chatman, 21, of the Detroit area began chatting with the girl, who lives in Springdale and who he thought was 13, on the popular chat app Discord, investigators allege. She was only 11.
Chatman's criminal complaint says the two engaged in graphic conversations via TikTok and Instagram as well. The graphic conversations escalated to Chatman's attempted sexual encounter in October 2022, but the girl never showed up, court records show.
"He also produced child pornography with her," a prosecutor is heard telling the judge in an audio recording of a 2022 criminal detention hearing obtained by WCPO.
"We could put every single trooper in the state of Kentucky working child sexual abuse material investigations and complaints in reference to child sexual exploitation, and we still couldn't cover it. We have to spread awareness that this stuff is going on to try to prevent," said Kentucky State Police Sergeant Zack Morris, who works with the Internet Crimes Against Children Task Force.
The trooper said parents must stay vigilant when protecting children from online dangers.
"Are we allowing our kids to take their electronics to bed with them? Keep them in their rooms?" he said.
For Morris, that means difficult conversations and watchful eyes.
"We should do a little research. We need to become — parents need to become their own investigators in their home," he said. "They need to look into these different apps, see how they work, see what's, you know, the good things, the good, the bad, the pros, the cons, and using it."
Do your kids spend hours on their phones when they should be asleep? Morris said that should be a red flag. Do they use encrypted web browsers designed to conceal their identity? That's another red flag.
Despite efforts to prevent CSAM, the images never truly go away. They live on forever, in whatever capacity predators are able to copy, save and distribute the material.
"Thousands and thousands, if not hundreds of thousands of people, you know, seeing those photos. They can just take a screenshot and keep that in their phone forever. And then those photos get scraped and they're posted on other websites automatically almost," Chaglasion said. "And so there's no way to even get those photos down. Ever and the photos that people have sent out personally to other people when they're distributed. It's like an impossible thing to it's like it's immortal like there's nothing you can do once it's out, it's out."
But there are efforts to make CSAM harder to find.
"We create a digital fingerprint for every image and video that we receive that is depicting the child sexual abuse," Walsh said. "It's a string of numbers and letters that's unique to that image or video."
Walsh is talking about image hashing. NCMEC has been doing it for years, sharing the identifying tags with big tech companies, including Google and Microsoft, so they can search their own databases for those images.
"If there is a matching, matching hash, that means there's an image of child sexual abuse material known to the National Center, and then they'll take actions to take that image or video down," he said.
NCMEC analysts review suspected CSAM and tag it with information about the type of content, the estimated age range of the people depicted and other details that help law enforcement prioritize the reports for review.
The center tagged more than 13.4 million files in 2022 and added 1.1 million hash values to its list of more than 6.3 million hash values of known child sexual abuse material, according to its website.
To double down on that effort, NCMEC also launched a new online tool, Take It Down. It's a free service for anyone who believes a sexually explicit or semi-nude image of themselves, taken before they turned 18, may be on the internet.
"They select images on their phone — images or videos right from their device — that we will create a hash for that digital fingerprint. That image and video never leaves their phone and we can never recreate that image from the hash either," Walsh said. "But just with that digital fingerprint, we can share that out with our partners and take it down and they can monitor their services, even encrypted platforms to see if that hash value matches another hash on their system."
Walsh said since the tool's launch in December 2022, NCMEC's already received more than 34,000 individual reports.
But waiting for justice can add to the turmoil haunting CSAM victims, because despite the efforts of law enforcement and advocates, many victims never get it.
"You can hire as many attorneys as you want, and go into any as many legal litigation battles as you want, and it's really hard when you're a child and then you have to hire an attorney, you have to tell your parents about it right? I think that's a really frustrating issue," Chaglasion said. "This is not something that children to take the burden of. It's a very, very complex legal process, and for one, for adults to be able to mitigate, let alone literal children who are 11, 12, 13, 15, even 18-year-olds, it's a very impossible battle to win."
Chaglasion said she's still going through litigation in efforts to hold her abuser, and others who have viewed and distributed her photos, accountable. Her journey inspired her to co-found Protect Kids, Not Abusers, a coalition pushing for legislative change.
Chaglasion's efforts focus on getting laws rewritten to better protect young victims.
"Pedophiles use (Omegle) to target children, and there's a lot of other websites that are being used, Roblox or just kids games, Discord, and Reddit, right?" Chaglsion said. "I've reached out to the team multiple times. There are Reddit forums that should not exist. Even Pornhub does not have like "Daddy Dom, Little Girl?" They don't allow certain keywords. And that was because of CSAM victims who have come out and advocated hundreds of millions of times and sued Pornhub because their content has made porn of millions of dollars."
The coalition has already gotten one bill passed in Chaglasion's home state of California.
SB-558 now extends the statute of limitations from 10 years after the production of CSAM to when the victim turns 40 years old, or 5 years after they realize the material has been produced, whichever date comes later. The bill also amends the definition of “distribution” of CSAM in the California Penal Code to include “public display.”
"A lot of children get their CSAM distributed before they're 8 years old, or like you can get your CSAM distributed, but not know anything about it. And to know that children — the statute of limitations were expiring before they were even of legal age to hire an attorney is crazy," Chaglasion said.
Her coalition is now setting its sights on the federal EARN IT Act, which, if passed, would allow victims to sue third-party websites that allow or ignore child sexual abuse material on their sites.
"It's a very important one, and I hope there's movement on it this year," she said.
The fight to prevent and remove child sexual abuse material will never end, but as Chaglasion and others continue working to unravel the web, they want victims to know:
"You are not at fault," Chaglasion said.