As a leader in the financial services industry, you may think of cybersecurity as a distant concern, something that happens to other people and other businesses. We get it. But technology has evolved, and cybercriminals have grown smarter and more sophisticated over the years, leaving every individual and business vulnerable to cyber attacks. That includes you.
Earlier this year, Clive Kabatznik, an investor in Florida, called his local Bank of America branch to discuss a large money transfer he was planning to make. Immediately afterward, a cybercriminal called the bank using an AI-generated deepfake of "Clive's" voice and tried to convince the bank's agent to move the money to another account. Fortunately, the agent was suspicious enough that no money was transferred, but not everyone has been as lucky.
Over in the UK, the CEO of an energy firm fell victim to a wild voice scam. He believed he was on the line with his boss at the parent company in Germany, and the voice on the other end told him to send $233,000 to a Hungarian supplier. Because the voice was a perfect copy, the CEO suspected nothing: no second-guessing, no hesitation. You'd think it all worked out, right? Nope. By the time anyone realized what had happened, the money had already been transferred to Mexico and dispersed to locations that couldn't be traced.
Advancements in generative AI (artificial intelligence) are allowing scammers to produce deepfakes to pull off these kinds of frauds. As the technology improves, it becomes harder for leaders in the financial services sector to detect fraudulent calls like the ones described above. The CEO of Pindrop, a security company that monitors audio traffic for many of the largest US banks, said he had seen a jump this year in both the prevalence and the sophistication of scammers' voice-fraud attempts. Another large voice authentication vendor, Nuance, saw its first successful deepfake attack on a financial services client late last year.
In a report by McAfee, a whopping 77% of AI voice scam victims said they lost money. The game is indeed evolving, and scammers are getting smarter: they can clone a voice from just three seconds of audio. Yes, you read that right – three seconds.
This isn't a drill. With AI tech advancing and voice recordings flooding social media platforms, the perfect storm for voice-related scams is brewing.
So, how do you protect yourself and your business?
First things first: share this article with your team and make them aware. Then set a golden rule: your team must ALWAYS verify with you via text or another channel before making any money transfer. And hey, if you're not the boss, do the same with your family – establish a code word, whatever it takes to verify a caller's legitimacy.
Check the caller ID. If it's suspicious or blocked, hang up and don't engage. Call the person back directly, or contact the place they claim to be calling from (school, office, etc.).
If the urgency is sky-high and they're demanding money ASAP via wire transfer or Bitcoin, that's a major red flag. Real emergencies don't come with sketchy payment demands.
In the business world, you've fought tooth and nail to get where you are. Cyber threats are lurking everywhere, and being nonchalant about cybersecurity is a one-way ticket to getting robbed.
Don't let that be your story. Click here to request a free Cyber Security Risk Assessment. It's quick, confidential, and obligation-free. Voice scams are just the tip of the iceberg, and our assessment can show you where you stand against cyber predators.
Don't be the easy target. Claim your complimentary Risk Assessment today. Your cybersecurity is worth it.