
Criminals are using deepfakes to impersonate CEOs
[Photo: Elijah O’Donnell/Unsplash]

“Deepfakes” refers to media that has been altered by artificial intelligence to make it appear that a person did or said something they never actually did or said. The technology first surfaced a few years ago, when crude deepfake tools let users make it look as though celebrities had been recorded engaging in sexual activities they never took part in.

But deepfakes are now moving past the porn realm and into the criminal world, where bad actors are using the tech to impersonate CEOs, Axios reports. For now, however, criminals appear to be using deepfake audio rather than video to pull off their scams:

  • Symantec, a major cybersecurity company, says it has seen three successful audio attacks on private companies. In each, a company’s “CEO” called a senior financial officer to request an urgent money transfer.

  • Scammers were mimicking the CEOs’ voices with an AI program that had been trained on hours of their speech—culled from earnings calls, YouTube videos, TED talks, and the like.

  • Millions of dollars were stolen from each company; their names were not revealed. The attacks were first reported by the BBC.

The threat deepfake audio poses to businesses cannot be overstated. Using deepfake audio to impersonate a CEO and trick the accounting department into wiring $1 million for an “emergency” is bad enough, but the tech could also be used for sabotage. What if a rival, or even a nation-state, wanted to sink Apple’s stock price? A well-timed deepfake audio clip that purports to capture Tim Cook privately discussing tanking iPhone sales could do just that, wiping billions off the company’s market value in seconds.

Unfortunately, there are currently no reliable tools that can easily and automatically identify deepfake media on the web. By the time a deepfake video or audio recording has been debunked, the damage may already be done.

If you want to see a deepfake in action, check out the one of President Obama, voiced by Jordan Peele, below.
