Scammers Trick Company Employee Using Video Call Filled with Deepfakes of Execs, Steal $25 Million

In a sophisticated cybercrime incident, scammers employed deepfake technology to orchestrate a $25 million theft from the Hong Kong branch of an undisclosed multinational company. The criminals utilized digitally recreated versions of high-ranking company executives, including the Chief Financial Officer, during a video conference call that deceived an unsuspecting employee.

The victim, the only real participant in the video call, followed instructions given during the meeting, transferring HK$200 million ($25.6 million) across 15 transactions to various Hong Kong bank accounts. The scam was only discovered a week later when the employee, suspecting foul play, contacted the company headquarters.

Despite initial suspicion triggered by an email mentioning a secret transaction, the employee was convinced of the call’s authenticity by the accurate deepfake recreations of their colleagues.

Acting senior superintendent Baron Chan Shun-ching said the scammers used deepfake technology to imitate the voices of their targets, having the fake executives read from a script. While other deepfake-related crimes have typically involved one-on-one video calls with a single fake persona, this incident relied on a multi-person video conference.

The deepfake representations gave specific instructions for the fraudulent money transfers before ending the meeting abruptly. The scammers then maintained contact with the victim through instant messaging platforms, emails, and one-on-one video calls.

Police are currently investigating the incident, with reports suggesting that two or three additional workers at the branch were targeted using similar deepfake video conference tactics. However, only one employee fell victim to the scam. No arrests have been made as the investigation unfolds.

This incident adds to the growing concerns surrounding deepfake technology, prompting discussions at both Congressional and White House levels. Last week, explicit fake images of Taylor Swift circulated on social media platform X, leading to increased scrutiny on addressing the deepfake problem. In June, the FBI issued a warning about sextortionists creating explicit deepfakes using individuals’ social media images.
