Meta’s voice-replicating AI is a dangerous security loophole


The new technology opens the door to phishing scams, misinformation, and the audio equivalent of deepfake pornography.

Meta, formerly known as Facebook, has introduced an innovative text-to-speech AI that can edit existing audio, converse in six languages, and, perhaps most eerily, mimic the voices of your loved ones.

Imagine: you could now have a conversation with your aunt without the hassle of an hour-long phone call. Or clone the voice of your governor barking orders. Or, Lord forbid, pass yourself off as President Vladimir Putin.

In a recent press release, Meta stated, "…We're announcing a breakthrough in generative AI for speech. We've developed Voicebox, a state-of-the-art AI model that can perform speech generation tasks — like editing, sampling and stylizing — that it wasn't specifically trained to do through in-context learning." 

In essence, all it takes to replicate someone's voice is a mere two-second audio snippet.

Voicebox will then "emulate the audio style," and voilà: with just a written prompt and a few clicks, you can generate an AI-powered replica of a friend or family member's voice.

Deepfake buddies

To be fair, Meta presents a compelling case for this specific functionality, asserting that the technology could "enable visually impaired individuals to hear written messages from friends in their own voices."

Advancing accessibility in technology is crucial, and this application could indeed be valuable. Nevertheless, the idea of imitating someone's voice remains disconcerting, with substantial potential for misuse. After all, if a friend's voice can be replicated from only a two-second audio clip, practically anyone's voice could be emulated given the right sample.

This potential security loophole opens the door to phishing scams, misinformation, and the audio equivalent of deepfake pornography. The ethical and legal implications are undeniable.

Fortunately, Meta is well aware of these risks and has promised to keep both the model and its underlying code closed-source for the time being. 

Voicebox, which is now in beta, clearly has substantial potential for abuse. On the other hand, aren't you tempted to hear the voice of a friend who has passed away, or a pop star "confessing" her love to you?
