Picture this: You are the CEO of a U.K. energy company. You receive a phone call from the CEO of your parent company, based in Germany, telling you to wire roughly $243,000 to a new Hungarian supplier. You are told that the matter is urgent and to send the money within the hour. You send the money and continue with your day. A few hours later, you receive another request to wire more money. You call your boss, only to receive another phone call at the same time from… your boss?
This situation may sound like something out of a poorly directed Lifetime film, but it actually comes from a real incident reported by the Wall Street Journal, in which deepfake voice technology was used to dupe a CEO into wiring a significant sum of money across international borders.
“The software was able to imitate the voice, and not only the voice: the tonality, the punctuation, the German accent,” a Euler Hermes spokesperson told the Washington Post. According to The Verge, the phone call was followed up with an email from a company account, and the money was then routed through accounts in Hungary and Mexico before disappearing.
Now, imagine a similar scenario, but this time it’s a video conference where a client is asking about sensitive details of an upcoming product launch. To the untrained eye, a deepfake’s facial features, speech patterns, and subtle shifts in body language could be virtually undetectable, with disastrous consequences.