Carelessly sharing personal photos and friend lists on social platforms like Facebook and TikTok is unintentionally aiding online scammers, especially as deepfake technology becomes increasingly advanced.

Cybersecurity expert Ngo Minh Hieu discusses security solutions in the age of AI on May 19. Photo: Event organizer

Artificial intelligence (AI) is ushering in unprecedented change, both beneficial and harmful. For criminals, AI tools are simplifying the execution of illegal online activities.

Cheap, easily accessible software sold across the dark web enables even non-technical individuals to launch cyberattacks, scams, and blackmail operations.

The misuse of AI to create fake images, spoof websites, distribute malware, and generate ransomware is on the rise. Criminals also use AI to produce toxic content for online dissemination, often for attention or defamation.

With AI, cybercriminals can execute schemes more efficiently, without deep technical expertise. This insight was shared by cybersecurity expert Ngo Minh Hieu, Director of the Anti-Scam Project, during a conference held on May 19.

Hieu cited a United Nations-supported study revealing that criminal groups can purchase deepfake software for as little as $25-30 to impersonate law enforcement and other authorities in fraud schemes.

One group in Cambodia has developed a tool so advanced that once a script and list of Facebook accounts are entered, it can automatically engage with potential victims. If a target responds, the system follows the programmed script. This allows scammers to deceive thousands of people daily.

According to Hieu, deepfake or deepvoice software only requires one image and a 20- to 30-second audio clip to generate highly convincing fake videos.

He cited cases involving losses of tens of millions of dollars due to deepfakes. For instance, an employee at a multinational company in Hong Kong was tricked into transferring $25 million to a fraudster impersonating the CFO in a video meeting.

The deepfake scam wave is expected to worsen as the technology becomes more sophisticated. Yet, internet users are unwittingly facilitating impersonation by oversharing their personal lives online.

This issue stems from the widespread habit of publicly posting photos and friend lists on social networks like Facebook, TikTok, and Instagram.

Fraudsters often initiate contact with targets, take screenshots of their photos, and later manipulate the images to impersonate them or even fabricate sexually explicit content for extortion.

Hieu advised users to restrict photo visibility to friends only and to set friend lists to “only me” to minimize data leaks.

He also urged responsible use of AI and social media, recommending limited disclosure of personal information for safety. In suspected deepfake video calls, users should watch the caller’s lip movements or request physical actions like standing, sitting, or turning to confirm authenticity.

At a cybersecurity conference in June 2024, a representative from VNPT eKYC, an electronic identity verification platform, noted that cybercriminals are using AI to create deepfake images aimed at bypassing online authentication systems. This poses a significant challenge for service providers.

Among Vietnam's four main types of eKYC fraud, deepfakes are the most complex. Criminals analyze and replicate a person’s facial features and voice using AI to generate manipulated photos and videos that mimic new movements and gestures.

In response, VNPT has adopted a “fight fire with fire” strategy, integrating AI to counter deepfake threats. Their solutions include facial comparison, face recognition, document forgery detection, voice authentication, and anomaly detection through data analysis.

The Ministry of Public Security has also issued widespread alerts about deepfake-related video and image fraud. According to the ministry, these scams go beyond financial investment fraud and include romance scams.

In such cases, perpetrators create fictional personas to build trust via video calls, then request money under the pretense of emergencies, travel expenses, or loan payments.

Du Lam