A Deepfake Phone Call Dupes An Employee Into Giving Away $35 Million

“Hi Susan, it’s Gene. Sorry for calling after hours, but I’m travelling. Can you please transfer $35,000 from our business checking account to a new supplier for a deposit on a job? Here’s their banking info…”

Does this sound like a familiar scenario? It should. It’s not uncommon for a business owner to call a financial manager and ask that a money transfer or online payment be made to a supplier or to a personal account. Is anyone going to question the boss’s request? Usually not.

But what if it’s not the boss? What if it’s just someone impersonating the boss? Or, more ominously, what if it’s the boss’s actual voice, manipulated into saying something different? And what if the request is for $35 million?

This is exactly what happened in early 2020 to a Hong Kong bank.

According to a report in Forbes, a manager at the bank received a call from a man he believed to be a company director, requesting that he authorize transfers of $35 million to fund an acquisition. But it wasn’t the director calling. It was a “deepfake” of the director’s voice. And by the time the bank discovered the fraud, the money was long gone.

Oh, and this isn’t the first time something like this has happened. Forbes also reported that an energy company in the UK fell for a similar ruse in 2019 and lost about $243,000.

“Audio and visual deep fakes represent the fascinating development of 21st century technology, yet they are also potentially incredibly dangerous posing a huge threat to data, money and businesses,” Jake Moore, a cybersecurity expert, told Forbes. “We are currently on the cusp of malicious actors shifting expertise and resources into using the latest technology to manipulate people who are innocently unaware of the realms of deep fake technology and even their existence.”

What’s even more terrifying is that deepfake technology is easy to find online. Go to sites like Resemble or Descript and check out how amateur pranksters are creating videos that show just how easily we can be fooled into thinking something we see (and hear) is real, even when it’s not. Now that it’s out there, this technology is increasingly being used for blackmail, fraud and identity theft. And audio is likely to be used more often than video because, according to Moore, manipulating audio is “easier to orchestrate than making deep fake videos.”

You may think your business is too small to be a target, but I don’t think so. If you’re like most of my clients, you have fewer financial controls than larger organizations, and you’re probably making more of your payments through online services. And getting a copy of your voice is easy, particularly if you’ve posted company videos on your website, given a public presentation, appeared in the media or gotten chatty with a “sales representative” on a cold call that was recorded without your knowledge. With only a few hours of work, someone could dupe your financial manager out of tens of thousands of dollars and be gone before you know it.

So what to do? Tighten up your internal controls. Require at least two authorizations for any bank transfer or payment, and perhaps three (including, at the very least, your own) for disbursements over a certain amount, such as $5,000. Hire your IT firm or subscribe to tools like KnowBe4 or Mimecast to provide ongoing training for your employees so they can spot the warning signs. (In the case of the Hong Kong bank, fraudulent emails were also sent to corroborate the deepfake phone call.) Refuse to process any transaction over a certain size that’s authorized by phone unless the person making the request has been called back at a known number. Involve your financial managers in large deals early so that they’re more aware of the dollars involved. Because let’s face it: This problem is only going to get worse.
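
To make those layered controls concrete, here is a minimal sketch, in Python, of what such an approval-policy check might look like. Every name, field and threshold in it is a hypothetical illustration of the rules above, not a real banking or payments API:

```python
# Hypothetical sketch of the internal controls described above.
# Thresholds, names and fields are illustrative assumptions only.

from dataclasses import dataclass, field

CALLBACK_REQUIRED_OVER = 5_000  # phone requests above this need a callback
EXTRA_APPROVAL_OVER = 5_000     # disbursements above this need a third approver

@dataclass
class TransferRequest:
    amount: float
    channel: str                     # e.g. "phone", "email", "in_person"
    approvers: set = field(default_factory=set)
    callback_verified: bool = False  # requester called back at a known number?

def is_authorized(req: TransferRequest, owner: str = "owner") -> bool:
    """Apply the layered controls: two approvals for everything, a third
    (including the owner's) over the threshold, and a callback check for
    large phone-initiated requests."""
    required = 2
    if req.amount > EXTRA_APPROVAL_OVER:
        required = 3
        if owner not in req.approvers:
            return False  # large disbursements need the owner's sign-off
    if (req.channel == "phone"
            and req.amount > CALLBACK_REQUIRED_OVER
            and not req.callback_verified):
        return False  # no callback to a known number, no transfer
    return len(req.approvers) >= required

# Example: the $35,000 "new supplier" call from the opening scenario.
req = TransferRequest(amount=35_000, channel="phone", approvers={"susan", "cfo"})
print(is_authorized(req))  # False: no callback, no owner approval
```

The point of encoding the policy this way is that no single convincing phone call, however real the voice sounds, can move money on its own.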

“Manipulating audio, which is easier to orchestrate than making deep fake videos, is only going to increase in volume,” Moore told Forbes. “And without the education and awareness of this new type of attack vector, along with better authentication methods, more businesses are likely to fall victim to very convincing conversations.”
