
Scams and AI: Artificial Intelligence is the Scammer’s New Tool

AI Voice Scams

Grandparents and parents are being tricked with fake voices as cybercrooks combine family emergency scams with AI.

You can’t believe everything you see or hear anymore, not with the very convincing combination of scams and AI. Artificial intelligence is the con artist’s newest tool in the bag of tricks.

According to a Wall Street Journal article (Oct. 6, 2023) that cited research from Microsoft, con artists have upped their game, quite successfully, to make their scams more believable and more effective.

Actually, fake (doctored-up) photos and fake (replicated) voices are already good enough to fool most people. However, when another component is added to the mix—in this case voices that sound exactly like someone a victim knows—panic, high emotions and rash actions can follow. Check out this important and helpful infographic on how to avoid being scammed by AI…

AI Voice Scam Infographic

Immediately after reading this article, reach out to your family and agree on a special, secret code word that everyone will know and easily remember.

  • That way, if you ever get a call from a family member that seems like a drastic emergency, one thing you can do is to ask the frantic caller to tell you the code word.
  • A scammer won’t know the word and will pretend he can’t remember it.

A true story of how family emergency scams have evolved.

Jennifer DeStefano in Arizona received a phone call this year from a number she didn’t recognize…but she definitely recognized the voice of her oldest daughter, who was scared and crying.

“Mom, these bad men have me,” the voice said. “Help me. Help me. Help me.” Her daughter, who was only 15, was away on a ski trip with her father. At least she was supposed to be.

That wasn’t the only voice Jennifer heard, though. Because next a man came on the phone. Here’s what he said, according to what Jennifer can recall now.   

“I have your daughter. If you call anybody, if you call the police, I’m gonna pump your daughter so full of drugs, I’m gonna have my way with her. Then I’m gonna drop her in Mexico and you’re never gonna see your daughter again.”

The male caller also told Jennifer he wanted $1 million to keep her daughter safe.

Jennifer was in a full panic—what parent of a teenage daughter who was kidnapped wouldn’t be?

While she was still on the phone, and very scared, Jennifer opened her front door, walked outside and started screaming for help. She was afraid a parent’s worst nightmare had happened to her.

Fortunately, her scream drew attention.

Hear something. Do something.

A neighbor heard Jennifer scream and called 9-1-1. The 9-1-1 dispatcher, thankfully, discerned right away that there might be a scam in play and had the wisdom to tell Jennifer to call her daughter directly.

Jennifer wasn’t able to reach her daughter with her call, but she was able to get through to her husband.

“Yes, Briana is here and she’s okay,” her husband told her.

A disaster was averted. Two, actually.

  • To begin with, her daughter was safe and not in the clutches of kidnappers.
  • Just as important, Jennifer didn’t let the “kidnappers” extort her for $1 million…or more.

Scams and AI. Adding fake voices to phone calls, with success.

Scams of all kinds are on the rise. Romance scams, fake job scams and shopping scams are a few. Imposter scams have exploded in number as well. That’s where a con artist pretends to be from your bank, the IRS…or a family member in trouble.

That is what Jennifer DeStefano experienced: a family emergency scam, enhanced by artificial intelligence that is now available to anyone.

Jennifer was lucky.

Her scream alerted her neighbor, who called 9-1-1, and the dispatcher warned Jennifer that it might be a scam—and it was. Many victims aren’t so lucky. Billions of dollars are lost to scams every year, including family emergency scams.

Family emergency scams are perfect for artificial intelligence.

It’s important to remember this fact: even before scammers started using AI to refine their schemes, the family emergency scam was already one of their favorite strategies.

Artificial intelligence technology and easily available apps have allowed them to take it to the next level. And here’s the scary part: All a scammer needs is a short sample of someone’s voice—typically from a video posted online or, in some cases, perhaps just the message on their phone saying, “I’m not here, leave a message.”

If you go on YouTube or explore stories about fake voice messages and AI, you’ll see plenty of samples and examples of people’s voices being replicated perfectly.

As if a frightening phone call isn’t enough to get you panicked, what’s worse is that talented and tech-savvy scammers can use a cloned voice in real time.

  • In other words, in addition to using pre-recorded clips of a cloned voice of someone you care about, there are tools now that allow them to carry on a conversation, in someone’s voice, to fool the victim the scammer is targeting.

Using AI to replicate someone's voice and commit extortion.

It’s very hard to keep a level head in the midst of a scary situation.

If you’ve watched a YouTube video or seen a news segment on family emergency scams, you can only imagine how you’d feel if this kind of scam happened to you.

Think about this, for a minute:

Just try to imagine, right now, a family kidnapping actually happening to someone close to you. Think about how panicked you’d be if it were real!

Then realize that if you did receive a call from someone (a spouse, child, or grandchild) pleading for help, certainly fear and panic would be your first reaction. That would be especially true if the voice you heard pleading for help sounded exactly like your loved one!

And that is why so many people do end up sending money to scammers. The con artists know how to use fear, threats and more to fool their victims, whether they say:

  • “We have your daughter and you need to pay us to keep her safe!”
  • “The police will arrest you today unless you pay your back taxes now.”
  • “You need to send us cash now if you want to stop the fraud on your bank account.”

After that initial panic…look for the red flags that tip you off to a voice scam.

So, let’s agree that if you get a family emergency call, it is quite likely that your initial reaction—physical, visceral or emotional—might include some fear or panic.

It is quite likely that…

  • Your blood might run cold.
  • Your heart might race.
  • Your mind will run a mile a minute.
  • You’ll have chills running down your back.
  • You won’t be able to catch your breath.

And if any of those happen at first, remember that these are simply natural human emotions.

Let that sink in…and let that initial reaction pass. Because if you know what to look for, you can out-maneuver the scammers.

Awareness of the red flags is the key to avoiding a trap.

Levine says once you’re aware of the signs of a scam, composing yourself and remaining calm will help you think clearly enough to make the right next moves.

  1. If you get one of these family emergency calls (which in itself is the biggest red flag), simply hang up. 
  2. Better still, don’t pick up a call from a number you don’t recognize.
  3. If you get a frightening or threatening call, remain calm and contact the authorities.
  4. Try not to let yourself be fooled simply because you hear the voice of a loved one. Keep the voice-aided AI imposter scams in mind.
  5. Create a code word for your immediate family. Talk to them about how you’d use it—remembering that most likely (and hopefully) you never will.

What’s probably the biggest red flag of nearly all scams?

If the caller demands that you…

  • Make a wire transfer.
  • Buy hundreds or thousands of dollars in gift cards.
  • Buy cryptocurrency to settle a debt or make a payment.

Scams are here to stay. Scams and AI are headed your way.

As artificial intelligence gets more refined and more available to the public—which includes scammers everywhere—telling the difference between what’s real and what’s fake will only get more difficult.

Moreover, the problem is going to continue to evolve and grow.

One way you can stay informed is to follow the Easy Prey podcast, hosted by Chris Parker, President and CEO of WhatIsMyIPAddress.com.

He interviews guests on fascinating subjects, including AI and the impact it has on our world.
