A finance worker in Hong Kong joined a video call with what appeared to be several coworkers and his company’s Chief Financial Officer. Everyone looked and sounded real, but every participant on the call was a deepfake. The employee was scammed into paying out more than $25 million to fraudsters, CNN reported on Sunday.
“(In the) multi-person video conference, it turns out that everyone [he saw] was fake,” Hong Kong Police official Baron Chan Shun-ching told media outlets on Friday.
Hong Kong police did not publicly identify the company or the worker, but the case is one of the largest known financial scams involving deepfake technology to date.
The employee had received several emails from his company’s CFO asking him to wire $25.6 million. He initially suspected a phishing scam, but the video call convinced him the request was legitimate.
Law enforcement said the employee recognized several people on the call, all of whom looked and sounded like his colleagues. He only discovered the scam when he checked in with the corporation’s head office later that week, according to CNN.
The meeting’s participants were digitally recreated using publicly available footage of the individuals, according to the South China Morning Post. Multiple employees were allegedly targeted at this company.
The Hong Kong police department noted this was one of many recent cases involving deepfake scams, and said it has made six arrests in connection with such frauds. A senior inspector recommended several ways to check whether a person on a call is real, such as asking them to move their head or answer questions that confirm their identity. These tactics, however awkward it might feel to direct them at your boss, may become necessary in the era of deepfakes.
Deepfakes have wreaked havoc in recent weeks, as AI-generated media has become convincing enough that it’s hard to tell what’s real. AI-generated pornographic images of Taylor Swift went viral in January, while a deepfake of President Biden’s voice told New Hampshire voters not to vote.