These New AI Scams Are On The Rise, Here's How To Spot Them Before It's Too Late



This year has seen an unprecedented pace of AI innovation, but those advancements have also added a new dimension to cybercrime. The latest weapon of bad actors is deepfaked audio or video, typically generated from recorded samples and then exploited for extortion and other frauds. In some cases, scammers use AI to clone the voice of someone close to the victim, who doesn't realize they're speaking to a scammer.

The latest example involves an 82-year-old Texan named Jerry, who was conned by a criminal posing as a sergeant with the San Antonio Police Department, according to ABC13. The scammer told Jerry that his son-in-law had been arrested and that he needed to send $9,500 in bail money to get him out, then convinced him to pay an additional $7,500 fee to complete the process. The perpetrators remain unknown, while the victim, who lives in an assisted living facility, is left contemplating taking a job to recoup the money he lost.

AI crimes are getting more sophisticated


This wasn't the first time an AI tool was used to commit fraud. Earlier this year, a Chinese man was robbed of over half a million dollars after a cybercriminal used an AI face-swapping tool to impersonate the victim's friend, persuading him to transfer the money, according to Reuters.

CBS News also reported on the rise of audio AI tools that can clone someone's voice, which is then used to pose as a person in distress. CNN covered a similar incident in which scammers tried to convince a mother that her daughter had been kidnapped, using an AI clone of the daughter's voice. In another case, reported by Insider, a father was told over the phone that his son had been in a serious accident.

Criminals often exploit such situations, sending fake media to loved ones to extract money under the pretext of emergency assistance. This is a modern take on the imposter scam, which is nothing new: in a February 2023 report, the FTC said Americans lost nearly $2.6 billion to this kind of fraud in 2022. The advent of generative AI, however, has raised the stakes dramatically.

How to protect yourself from AI scammers


Adding fuel to the fire of AI-assisted digital crimes is Big Tech. The likes of Meta and Microsoft have created AI models that can replicate a person's voice with a minimal number of voice samples. "With a few seconds of audio taken from an Instagram Live video, a TikTok post, or even a voice note, fraudsters can create a believable clone that can be manipulated to suit their needs," says McAfee [PDF]. So how can you protect yourself from these scams?

The most reliable defense, aside from ignoring calls from unknown numbers, is to establish a codeword with loved ones so you can confirm it's really them on the other end of the line. You can also reach out to the person directly through a channel you trust to verify whether they're actually in trouble. If you get a call that seems to be from a friend or family member facing an emergency, ask personal questions a scammer wouldn't know the answers to. When in doubt, experts advise hanging up and calling the person back, or otherwise verifying the story before responding.

Unfortunately, voice cloning isn't the only AI-based attack scammers use. A related threat is extortion with deepfaked content: bad actors have repeatedly tried to blackmail victims with AI-generated explicit imagery, and The Washington Post has reported multiple incidents in which such deepfakes upended the lives of teenagers. In that situation, rather than taking matters into your own hands, contact law enforcement as soon as possible.
