AI is used by scammers to manipulate you through “social engineering”
Former NSA cybersecurity expert Evan Dornbush told Forbes that AI helps scammers create believable messages faster than ever, sharply lowering the cost of mounting these attacks. As Dornbush put it, "AI is decreasing the costs for criminals, and the community needs novel ways to either decrease their payouts, increase their operating budgets, or both."
While most people think of fake videos when they hear the word "deepfake," the latest battleground is faked audio used to scam people. Imagine getting a phone call claiming that your daughter is being held against her will and will be released only if you pay a ransom. Scams like this have succeeded without AI, but now imagine hearing the shaky, terrified voice of your daughter, spouse, son, or even one of your parents describing the ordeal and begging you to pay whatever the criminals demand. With AI, these calls can be fabricated even while the relative you thought you just heard pleading for her life is safely at the movies or at home.
How to prevent your family from getting scammed by a deepfake audio attack
Imagine being certain that the caller was one of your kids or another relative; you'd be on your way to the bank in no time. The FBI has already issued a public service announcement (alert number I-120324-PSA) warning people about these attacks, and the G-men have a practical countermeasure: the next time you speak with a loved one, agree on a code word that no one else knows, with each family member choosing their own. That way, if you get a frantic call from your daughter, you can confirm whether the voice coming through your iPhone's speaker is really hers or an AI-generated deepfake.
The article back in December drew a comment from a reader who called the secret-word idea "the most idiotic and cynical advice ever." The commenter worried that under stress, your relative might forget the code word, leading you to dismiss what could be a genuine ransom call. That is a legitimate concern, which is why the code word should be something easy to recall even under stress.
If you don't like the idea of creating a code word, listen closely instead: if your relative's voice sounds robotic, or if certain phrases are repeated out of context, chances are you are listening to a deepfake.