A Vietnamese cybersecurity expert has recently introduced a simple yet effective technique to identify deepfake fraudsters amid a rising number of video call scams in Vietnam.
Ngo Minh Hieu, also known as Hieu PC, a renowned cybersecurity specialist at the Vietnam National Cybersecurity Center, suggested a straightforward test: ask the suspicious caller to show their teeth.
Hieu PC, who launched the Chongluadao (anti-scam) initiative in 2021, explained that deepfake AI struggles to replicate human teeth accurately in real-time video calls.
The 36-year-old expert said that when asked to open their mouth, a deepfake impersonator may display unnatural dental features, such as missing or distorted teeth, a clear indicator of deception.
"If it is a deepfake, when they open their mouth, some might have no teeth at all, while others could have three or even four sets of teeth," he explained.
"That’s why teeth are an easy way to identify a fake video call."
Hieu PC further advised requesting the caller to turn their head left or right, stand up, or perform spontaneous movements if fraud is suspected.
These actions disrupt deepfake rendering, exposing inconsistencies that signal a fraudulent call.
This advice comes amid a sharp increase in deepfake scams in Vietnam. Cybercriminals use AI to impersonate individuals on video calls, often to solicit emergency funds or extract sensitive information.
Authorities have recently reported receiving two to three complaints per month. Many cases involve young people and students befriended on social media, coerced into compromising situations during video calls, and subsequently blackmailed.
Hieu PC highlighted two major red flags during such calls: unsolicited money transfer requests and prompts to click on unfamiliar links.
He emphasized that in these scenarios, the best response is to terminate the call immediately.
He reported that a common scam involves faking the voice and face of a victim's relative over the phone or FaceTime to trick them into transferring money.
"They may call from an unknown number, using a voice identical to a friend’s, and urgently ask for money," he said.
Even more concerning, hackers use AI to manipulate explicit videos by inserting the victim’s face, then send blackmail emails threatening to release the footage unless a ransom is paid.
For suspicious FaceTime calls, he advised verifying the caller's identity by calling back on a known number and asking personal questions that only the genuine person would know.
By remaining vigilant and using these simple verification techniques, individuals can better protect themselves from deepfake video call scams.
As deepfake technology evolves, Hieu PC urged the public to stay informed and cautious. Raising awareness and practicing these detection methods can help mitigate the risks associated with AI-driven fraud.
Authorities also encourage people to report suspected scams to cybersecurity agencies to prevent further incidents.
Adopting these measures can help individuals safeguard their privacy and financial security against the growing threat of deepfakes, contributing to a safer digital environment for everyone.