A catchy country tune isn’t usually the start of a conversation about bank fraud. But this week an AI-generated track called “Walk My Walk” changed that in a hurry.
The song came from a virtual artist named Breaking Rust. No singer stood behind a microphone. AI built the voice and the music from scratch. What shocked people most wasn’t the technology. It was the success. The track hit No. 1 on Billboard’s Country Digital Song Sales chart and racked up more than 3 million Spotify streams in under a month.
Country musicians chimed in right away. Some said AI can write a melody, but it can’t stand in front of a crowd and pour its heart out. Others said AI might help with the creative process, but it shouldn’t replace human writers or performers.
So why does a bank care about a chart-topping AI country song?
Because the same technology that can build a hit can also build a fake voice, and scammers use it every day. That’s where audio deepfake scams come in.
What AI Can Do with a Voice and Why It Matters
A deepfake is a piece of audio or video that looks or sounds real but comes from AI. When scammers create audio deepfakes, they clone someone’s voice. And they don’t need much to do it. Sometimes a few seconds of audio from a social media post, podcast clip, or voicemail is all it takes.
Once scammers have that voice, they can build sentences the real person never said. They can make the voice sound scared, calm, angry, or confused. They can call at odd hours. They can layer in background noise to make the situation feel urgent.
AI can write a song. It can also create a fake phone call that feels personal and real enough to fool anyone, even people who think they’re too cautious to fall for fraud.
That’s why audio deepfake scams are rising fast. Law enforcement agencies and regulators now warn banks and consumers about these tools because fraudsters use them to impersonate family members, business leaders, or even bank employees.
How Audio Deepfake Scams Target Your Money
Criminals always follow the easiest path to your money. Audio deepfakes give them a shortcut. Here are the most common ways these scams work.
1. Fake Family Emergencies
You’ve probably heard of the “grandparent scam.” A caller claims to be a grandchild who needs money right away.
Now imagine that same scam, but with a voice that sounds exactly like your grandchild — shaky, scared, and begging for help. The scammer might even know details from social media, such as whether your grandchild is traveling or just moved and needs to borrow money.
Once you panic, your guard drops. That’s when the scammer pushes you to wire money, send a transfer through online banking, or withdraw cash.
Speed is their weapon. Emotion is the fuel.
But the good news is that there are ways to fight back. Read our article on fake family emergency scams here.
2. Fake Calls from Your Bank
This scam is on the rise. The caller ID might show a number that looks like your bank. The voice might sound like a banker you spoke with last month.
With AI voice cloning, scammers can copy a real person’s tone and pace. They might say:
- “We detected fraud. Read me the one-time passcode we just sent you.”
- “Your funds are at risk. Move everything to this ‘safe account.’”
- “We need your login or debit card details to verify activity.”
These are audio deepfake scams, even if the message sounds routine.
If you ever get a call like this, hang up. Then call the bank directly using the phone number on your debit card or in your online banking app.
3. Attacks on Voice Authentication
Several major services use voiceprints to verify identity during phone calls, and those systems are now at risk. Some companies once promised, “my voice is my password.” That approach no longer stands on its own.
For example, 1st Source’s head of security recently prank-called our CEO, spoofing the voice and phone number of one of our top executives. He then shared the encounter with the rest of our team to prove the point.
AI can fool both humans and voice authentication systems. That’s why 1st Source uses multiple layers of security, like one-time codes, secure apps, and real-time monitoring of account activity. A voice can be faked. A layered system is much harder to break.
Why Everyday Customers Face the Most Risk
Many people think deepfake scams only target celebrities or corporate leaders. The truth is simpler: the people most at risk are everyday consumers.
Why? Because we share more audio online than ever before. Family videos, graduation clips, church programs, or birthday toasts all give scammers the raw material they need. And they don’t need studio-quality sound.
Scammers use phishing emails, fake text alerts, and now audio deepfake scams to turn that material against you if you’re not careful.
We invest in strong security, fraud monitoring, and employee training. But the most powerful tool is knowing the warning signs.
That’s why we’ve put together a comprehensive guide with over 50 articles about spotting and avoiding scams.
How to Protect Yourself From Audio Deepfake Scams
The good news? You don’t need a tech degree to stay safe. You just need a few smart habits.
Don’t trust caller ID or a familiar voice
Both can be faked. Treat any surprise call about money, passwords, or urgent trouble as a red flag.
Pause and ask yourself:
- Did I expect this call?
- Did I start this conversation?
- Is the caller asking for information they should already know?
If something feels off, hang up.
Use the “pause and verify” approach
It’s simple and it works.
- Hang up.
- Call back using a trusted number from your card or the bank’s website.
- Visit online or mobile banking by typing the web address yourself.
- Never share one-time passcodes, passwords, or full account numbers with someone who called you.
One pause can stop a scam in its tracks.
Set a family code word
Pick a phrase only your close family knows. If someone calls about an emergency, ask for the code word. A real family member knows it. A scammer never will.
Watch for pressure or secrecy
Scammers push fast:
- “Do this right now.”
- “Tell no one.”
- “You’ll lose everything if you wait.”
Real banks never rush you like that. We want you to slow down, ask questions, and feel safe.
Learn the signs of imposter scams
Audio deepfakes fall under a bigger family called imposter scams: schemes in which someone pretends to be your bank, a government office, a company, or even a friend.
You can learn more about these tricks on our page about imposter and spoofing scams.
Knowing the signs helps you spot trouble before it reaches your accounts.
Staying Safe in a World of Synthetic Voices
The rise of an AI-generated country hit shows how far technology has come. At the same time, audio deepfake scams show how far criminals will go to take advantage of it.
At 1st Source Bank, we track new threats, update our systems, and work with law enforcement to protect our clients. We also believe clear, simple education gives you the confidence to protect your own money.
You don’t need to fear every call. You just need to stay alert, trust your instincts, and verify anything that feels off.
If a call, text, or email makes you uneasy, stop and check. Reach us through the phone number on our website, or visit a nearby banking center. We’re here to help you protect your money and your peace of mind.
