Not sci-fi, but real life: Samsung introduced a voice-cloning feature for its new Galaxy S phones this week. Train the Bixby mobile assistant on your S23, S23+, or S23 Ultra, and it'll mimic you during calls. The feature is pitched as a way to answer calls whenever you're unable to talk: you type your answers instead, and the phone reads them to the other party in your voice.
But there’s a flip side to this stuff of sci-fi dreams. AI-trained voice imitation can also be used for nefarious purposes, as Motherboard demonstrated on the very same day as Samsung’s news. Using AI-generated imitations of his voice, writer Joseph Cox breezed through his bank’s automated voice verification system.
Experts have warned for years of voice authentication’s weaknesses; long before AI-based tools, such systems could be bypassed with recordings. But those scams often involved cold-calling a mark and coaxing them into saying pertinent phrases, such as manipulating them into answering “Yes” to a basic question. AI changes the game: no interaction is necessary for the mimicry. Nor do these services need much of your voice for training. Just a few minutes of audio is enough.
For now, the danger isn’t widespread. For this sort of trick to happen without your knowledge, recordings of your voice have to be publicly available. It also still takes some work. Motherboard initially failed to fool Lloyds Bank with output from ElevenLabs, the tool it used; multiple attempts and tweaks were needed before the output passed as legitimate. And if a financial or other sensitive service monitors additional signals that you’re you, an attacker would likely need to combine the cloned voice with number spoofing or something like Samsung’s Bixby call feature.
But that doesn’t mean you should rest easy. Motherboard’s experiment is a pointed reminder that biometric authentication should be treated with the same consideration as other forms of account or device protection. In this context, biometric data amounts to a set of physical passwords, and like any password, if the information is accessible or otherwise insecure, your security is weakened.
Your voice can be recorded. Your face can be photographed. Your fingerprint can be used while you’re incapacitated. You don’t need to stop using these methods of safeguarding your privacy, but their very convenience makes them weaker forms of defense. Think of them as the equivalent of short passwords: they’ll deter some people, but a determined attacker can bypass them more easily.
The smart approach is to reserve biometric authentication for lower-stakes situations: a PC where you never stay logged into sensitive sites, a phone whose important apps are protected by a passcode or password, and the like. And if you can’t pair your voice or face with stronger measures? Disable them, if you can.
At the very least, if your bank uses voice verification, it’s wise to add a second authentication factor. Because while AI tools aren’t nailing impersonations off the bat just yet, they’ll continue to improve, increasing the potential threat. An ounce of prevention could be worth a pound of cure, especially if you like to stream or post videos.