Don't Waste Your Money


Hear how AI can clone your voice and use it in phishing scams

Reporter's words recorded, then turned into the "Grandparent scam"

If you think you have a good grasp of what a phishing email looks like, or what a scam call sounds like, don't be so sure.

Technology is making it a lot easier to mimic what a person sounds like.

As a result, the so-called "grandparent scam" is getting more sophisticated with help from artificial intelligence, or AI.

We got to see firsthand how it can work, and it is a bit disturbing.

You may be familiar with the scam, in which a caller impersonates a close relative and claims there's a crisis to get money.

Now, with help from AI, fooling parents and grandparents is easier than ever.

"Hi Grandma, I need your help!"

Dave Hatter of Intrust IT Security says scammers can now use free AI programs to capture a relative's voice, then use it in a scam call.

"With as little as 3 seconds of audio you can clone someone's voice," he said.

To demonstrate, he recorded me reading a line of text, using one of several free AI programs available online.

"Once upon a time, the King's youngest son ....." I read, which is a short excerpt out of a children's story.

Then he typed a common phrase you’d hear from a scammer, and the AI program had my voice say it.

"Hi Grandma, it's John calling. I need your help," the computerized voice said, sounding like a slightly synthesized version of my voice.

It's not perfect, but it could convince your 90-year-old grandmother that you were in trouble and needed her to wire you some money.

"If I got a voice mail from you, John, I would assume it was you talking to me," Hatter said.

Scammers could also use it to mimic a teen's voice for the "kidnapping scam," in which they make it appear a child has been kidnapped.

Pete Nicoletti with the cybersecurity firm Check Point says a scammer only needs a snippet of video from social media.

"They can download a speech that you gave, from a Linkedin profile, that you did something in public," he said.

Or they could take your voice off an Instagram or TikTok video, as they only need a few seconds.

Then they can have your voice calling your grandparents, or parents, saying that you are in desperate trouble and need money.

AI helping with email scams too

AI is also making it harder to spot phishing emails.

Nicoletti says AI tools clean up the grammar and spelling errors that typically help you spot a scam.

"It lowers the bar for criminals to create phishing emails en masse," he said.

So the next time you get a call or message from a loved one, Hatter says, be careful.

"You assume it must be them, it sounds so much like them," he said.

If they claim they are in trouble, make sure it's really them by calling or texting them directly.

And alert your older relatives to this new version of an old scam.

That way you are not fooled, and you don't waste your money.

_________________

"Don't Waste Your Money" is a registered trademark of Scripps Media, Inc. ("Scripps").


For more consumer news and money-saving advice, go to www.dontwasteyourmoney.com


Have a problem?
Send us an email at jmatarese@wcpo.com or Taylor.Nimmo@wcpo.com, or message John on Facebook and Taylor on Facebook.