AI-Generated Voice Deepfakes Become New Tool for Fraudsters

S.J. Steinhardt
Published Date:
Aug 30, 2023

Scammers have found a way to use artificial intelligence (AI)-generated voice deepfakes to trick people out of their money, The New York Times reported.

Voice deepfakes are vocal renditions that mimic real people's voices. Pindrop, a company that monitors audio traffic for many of the largest U.S. banks, has seen a jump this year in both the prevalence of voice deepfakes and the sophistication of scammers' voice fraud attempts, a source told the Times. A customer of Nuance, a voice authentication vendor, was the victim of a deepfake attack late last year.

Such voice-related AI scams have been enabled by the speed of technological development, the falling costs of generative AI programs and the wide availability of recordings of people's voices on the internet, according to the Times. Stealing wealthy clients' bank account details has become even easier for hackers, as these individuals' public appearances, including speeches, are often widely available online. Finding audio samples of everyday customers can also be as easy as conducting an online search.

“There’s a lot of audio content out there,” Vijay Balasubramaniyan, the CEO and a founder of Pindrop, told the Times. His company reviews automatic voice-verification systems for eight of the 10 largest U.S. lenders. Most of the fake voice attacks that the company has seen have come into credit card service call centers, where human representatives deal with customers needing help with their cards, he said.

Brett Beranek, the general manager of security and biometrics at Nuance, told the Times that his biggest concern is not attacks on call centers or automated systems, such as the voice biometrics systems that many banks have deployed, but the scams in which a caller reaches an individual directly. “I had a conversation just earlier this week with one of our customers,” he said. “They were saying, hey, Brett, it’s great that we have our contact center secured—but what if somebody just calls our CEO directly on their cellphone and pretends to be somebody else?”

Data breaches that reveal the personal information of bank customers, a basic cybersecurity threat that has been around for decades, are boons to these sophisticated attacks. From 2020 to 2022, personal data on more than 300 million people fell into the hands of hackers, leading to $8.8 billion in losses, the Federal Trade Commission (FTC) reported earlier this year. Of that total, the largest share, more than $3.8 billion, came from investment scams.

One near-victim of a voice deepfake was Florida investor Clive Kabatznik, whose voice was artificially generated by a software program and used in an attempt to trick his banker into making a money transfer. The scammers were able to generate his voice from online videos of him speaking at a conference and participating in a fund-raiser. Fortunately, the fraud was detectable: the artificial voice was repetitive and used garbled phrases.

“I think it’s pretty scary,” Kabatznik told the Times. “The problem is, I don’t know what you do about it. Do you just go underground and disappear?”
