Kerala Man Loses Rs 40,000 in Sophisticated AI-Based ‘Deepfake Scam’

As the digital world continues to expand, cyber scams and frauds are keeping pace with technology and rising alongside it. And with artificial intelligence (AI) being the latest craze in the tech world, fraudsters are finding new ways to use it to their benefit.

One such case recently came to light. PS Radhakrishnan, a resident of Kozhikode in Kerala, lost Rs 40,000 after cybercriminals used deepfake AI technology to pose as a former colleague on a WhatsApp video call and seek money for his sister-in-law’s surgery.

Radhakrishnan first received a call from an unknown number, which he ignored. Later, he found several messages from the same number on WhatsApp, with the sender identifying himself as a former colleague at Coal India Ltd.

“We had worked together for nearly four decades and I knew him well. The display picture was his photo. He asked about my daughter and where she worked. We texted for some time during which he shared his family photographs and asked about our common colleagues,” Radhakrishnan said, as per the Hindustan Times (HT) report.

After some time, the victim received a voice call from the same number. The caller said he was at the Dubai airport, waiting to board a flight to India. He asked for a financial favour: his sister-in-law was scheduled for emergency surgery at a hospital in Mumbai, and Rs 40,000 had to be paid urgently as an advance. He said the money had to be transferred via UPI to the phone of someone with her at the Mumbai hospital.

The retired official wanted to be doubly sure that he was not being taken for a ride. The man immediately offered to make a video call.

Fraudster Impersonated The Face Of Victim’s Colleague
“Seconds later, he called and looked exactly like my former colleague. Even though only his face was visible, it was clear. His lips and eyes moved like any normal person as we talked in English. The call lasted just 25 seconds before it got cut. He later came back on a voice call and spoke about the urgency for money. I didn’t ask any more questions and transferred the money,” Radhakrishnan said, as per the report.

A few minutes later, the same man called again and asked for Rs 35,000 to take care of hospital expenses. The Kerala man grew suspicious, saying, “There was a hurried tone in his voice, and I lied that my account didn’t have sufficient balance.”

Radhakrishnan then called his former colleague on the number saved in his contact list, and the colleague said he had never called. He then realised he had been cheated by a scamster who, the police believe, had used deepfake technology to impersonate someone the victim knew.

How Do Deepfake Scams Happen?
AI-based deepfake calls are a type of scam that uses artificial intelligence to create fake videos or audio recordings of people. Scammers place these calls impersonating a trusted friend, family member, or colleague to trick the victim into giving away money or personal information.

Deepfakes can be seen as the 21st century’s answer to photoshopping. They use a form of artificial intelligence called deep learning to fabricate images of fake events, hence the name, as per The Guardian.

Deepfake technology can create convincing but entirely fictional photos and videos from scratch. Audio can be deepfaked too, producing “voice skins” or “voice clones” of anyone.
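
For the technically curious, the classic face-swap deepfake rests on a surprisingly simple idea: train one shared encoder that captures expression and pose, plus a separate decoder for each person’s face, then “swap” by decoding person A’s encoded face through person B’s decoder. The PyTorch sketch below illustrates the principle only; the layer sizes, the 100 training steps, and the random tensors standing in for face crops are all illustrative assumptions, not any real system’s recipe.

    # Minimal sketch of the shared-encoder / two-decoder autoencoder behind
    # classic face-swap deepfakes. All sizes and the random "face" tensors
    # are illustrative assumptions, not a real system's configuration.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):  # shared: learns identity-agnostic expression/pose features
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
                nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
                nn.Flatten(),
                nn.Linear(64 * 16 * 16, 256),                          # latent face code
            )

        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):  # one per identity: renders that person's face
        def __init__(self):
            super().__init__()
            self.fc = nn.Linear(256, 64 * 16 * 16)
            self.net = nn.Sequential(
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
            )

        def forward(self, z):
            return self.net(self.fc(z).view(-1, 64, 16, 16))

    encoder, dec_a, dec_b = Encoder(), Decoder(), Decoder()
    opt = torch.optim.Adam(
        list(encoder.parameters()) + list(dec_a.parameters()) + list(dec_b.parameters()),
        lr=1e-3,
    )
    loss_fn = nn.MSELoss()
    faces_a = torch.rand(8, 3, 64, 64)  # stand-ins for aligned face crops of person A
    faces_b = torch.rand(8, 3, 64, 64)  # stand-ins for aligned face crops of person B

    for step in range(100):  # each identity is reconstructed through its own decoder
        opt.zero_grad()
        loss = loss_fn(dec_a(encoder(faces_a)), faces_a) + \
               loss_fn(dec_b(encoder(faces_b)), faces_b)
        loss.backward()
        opt.step()

    # The "swap": encode person A's face, decode with person B's decoder,
    # producing B's likeness driven by A's expression and pose.
    fake_frame = dec_b(encoder(faces_a[:1]))
    print(fake_frame.shape)  # torch.Size([1, 3, 64, 64])

A real deepfake pipeline trains this kind of model on thousands of aligned face crops per identity and blends the swapped face back into each video frame; voice clones follow the same encode-and-regenerate principle on audio features instead of pixels.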

In A Nutshell
The ‘Deepfake Scam’ is an AI-based fraud in which criminals use artificial intelligence technology to create fake audio or video recordings of individuals, often impersonating someone known to the victim.

In this particular case, the fraudsters used AI-generated video and voice to impersonate the victim’s former colleague. They then used this fake likeness to deceive him into transferring money, believing the request was genuine.

The technology used in deepfake scams has become increasingly sophisticated, making it challenging for individuals to distinguish between real and fake audio or video content. As a result, people may unknowingly fall victim to such scams, leading to financial losses and potential security risks.

To avoid falling prey to deepfake scams and similar frauds, individuals should exercise caution when receiving requests for money or personal information, especially if they come through unusual channels or seem out of character for the person making the request. It is essential to verify the authenticity of such requests through another means of communication, such as a phone call to a known number or in-person confirmation, before taking any action. Staying updated on the latest cybersecurity threats and raising awareness about deepfake scams can also help protect oneself and others.

Deepfake scams are becoming increasingly common, as the technology to create fake videos and audio recordings becomes more sophisticated. It is important to be aware of these scams and to take steps to protect yourself. Here are some tips:

  • Be suspicious of any unsolicited messages, especially those that ask for money.
  • Do not click on links in messages from people you do not know.
  • Verify the identity of the person you are talking to by calling them on a number you already have saved or by meeting them in person.
  • Be careful about what personal information you share online.

If you think you have been the victim of a deepfake scam, report it to the police. You should also contact your bank to block or reverse any unauthorised transactions.
