
Proposed Ohio bill would require disclaimers for political AI deepfakes, add criminal penalties


COLUMBUS, Ohio — A bipartisan bill introduced in the Ohio House aims to combat election misinformation by requiring disclaimers on artificial intelligence-generated content. Creators could also face criminal charges.

Before the New Hampshire primary election, a robocall went out with seemingly President Joe Biden's voice urging voters to "save your vote for the November election."

Although it sounded like Biden, it wasn’t actually him. It was a deepfake — an AI-generated audio clip mirroring the president’s voice.

This doctored audio shows a larger problem impacting politicians and has raised red flags for state Rep. Joe Miller (D-Amherst).

"The preservation of our democratic integrity is so crucial," Miller said. "Ensuring accuracy and authenticity of information that's related to the election cycle or campaigns is fundamental."

He has introduced House Bill 410 to combat this type of misinformation by regulating deepfake media meant to influence elections. The bill would require all AI-generated images and videos to carry a watermarked disclaimer identifying them as artificial; audio would require a spoken statement.

"We need people to realize this piece of information has been put out and it has been manipulated," he said.

The bill is also meant to stop people from becoming disengaged from the political process. WCPO's sister station, WEWS, found that negative attack ads don't make people support the opponent, but rather make voters less likely to show up on Election Day.

"They say ‘everyone lies in these ads and everyone's terrible, so why should I bother to participate?'" Ohio University professor Benjamin Bates said in 2022.

H.B. 410 is bipartisan, but Case Western Reserve University technology law professor Eric Chaffee said this would be incredibly tough to enforce.

"Requiring these disclaimers, in fact, may be spoiling that art or forcing people to say things that they don't want to say," Chaffee said. "There are certainly First Amendment concerns."

Victims of deepfakes would also have civil recourse: they could sue both the content's creator and the social media company that allows it on its platform.

There is also a criminal penalty. A person found guilty of creating this content would face a first-degree misdemeanor charge.

The bill is written so vaguely that it's unclear whether it could hold up in court, Chaffee argued.

"It's an extreme amount of liability for people who are creating things that may be viewed as being deep fakes, [and there's a] relatively ambiguous definition of what is a deep fake under this bill," the professor added.

The legislation needs work, Miller agreed, but that is why there is the committee process. He was just hoping to start the conversation.

"This brings awareness to it, so people are a little bit more skeptical when they see something flash across their social media posts," the Democrat said.

Gov. Mike DeWine's team said he supports protecting Ohioans but also wants them to benefit from new technology.

Lt. Gov. Jon Husted recently announced an AI Toolkit program for K-12 educators, with AI literacy as one of its main focuses. But the technology isn't always beneficial, he said.

"AI is a technology that can be used for both good and evil," Husted said. "I think it’s very important for the legislature and all of us to educate ourselves as best we can on how to make the technology safe for our children and society. I encourage the discussion."

A bipartisan group of state senators is also seeking to prevent malicious AI.


State Sen. Louis W. "Bill" Blessing, III (R-Colerain Township) proposed S.B. 217 to criminalize the use of artificial intelligence in generating sexually explicit material of children — in addition to adults who did not give their consent.

Both bills will likely be heard in the upcoming months.