Voice Cloning: Don’t Get Duped

December 28, 2020

In just a few years, voice cloning (VC) has emerged as a significant factor in communications, both legal and illegal. It has become more affordable and far more convincing than outdated robotic voices. The possibilities with this type of cloning may be endless.

One beneficiary is a person losing their voice to illness who wants synthetic speech that sounds as close as possible to how they once sounded. Those who have never been able to speak could also reap huge benefits. But don’t overlook the scammers and stalkers who view this new technology as a great way to dupe innocent people. While cloning tech is evolving at a breakneck pace, few legal protections exist to control the rise of these deep fakes, and consumers should be wary. Very wary.

What is voice cloning?

Researchers can now create convincing, near-perfect voice duplicates, or clones, from a five-second snippet of a person’s voice. You read that right: just five seconds. Using text-to-speech (TTS) software or artificial intelligence (AI) programs, researchers have already created highly convincing audio that can sound like anyone. The internet is full of websites that offer to demonstrate voice cloning for free.

With just a fragment of recorded speech, artificial intelligence programs can now extract a speaker’s unique inflections and tones, then transform those elements into full sentences scripted by AI. Some features, such as the intermittent pauses of natural human speech, still prove challenging, but the basics are now fast and cheap. Honest folks around the globe need to learn how this new tech might cause them harm.
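To make that concrete, here is a minimal sketch of what such a pipeline can look like in practice. It assumes the open-source Coqui TTS toolkit and its pretrained multi-speaker YourTTS model; the model name, arguments, and reference file below are illustrative stand-ins, and details vary by toolkit and version.

```python
# Minimal voice-cloning sketch, assuming the open-source Coqui TTS toolkit
# (pip install TTS). The model name and file paths are illustrative.
from TTS.api import TTS

# Load a pretrained multi-speaker model that supports zero-shot cloning:
# no training on the target speaker is required.
tts = TTS(model_name="tts_models/multilingual/multi-dataset/your_tts")

# "reference.wav" stands in for a few seconds of the target speaker's audio.
# The model distills it into a voiceprint (inflection, tone, timbre), then
# reads any scripted sentence aloud in that voice.
tts.tts_to_file(
    text="This sentence was never spoken by the person you are hearing.",
    speaker_wav="reference.wav",
    language="en",
    file_path="cloned.wav",
)
```

The takeaway is how little an attacker needs: a few seconds of someone’s recorded voice and a script.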

The Federal Trade Commission (FTC) held its initial workshop on voice cloning earlier this year; it’s probably not the last. The landscape of voice cloning is evolving daily. Ten months ago, the federal agency referred to voice cloning as a phenomenon where “WOW meets OMG.”

Cloning for good

The WOW factor is jaw-dropping. Appropriately used, VC delivers a tremendous benefit to society. AI can give patients with ALS or survivors of head and throat cancers their voices back. Individuals who have never been able to speak can communicate audibly. Some programs even let users tell the AI which words to emphasize.

Patients can bank audio recordings for future use. You bank yours every time you record a cellphone greeting, so voice files are nothing new. Now patients can stockpile multiple phrases and then play them back later with customized emotion.

Think of your voice’s tone and inflection as your acoustic fingerprint. Unlike other features that are unique to you, such as palm prints, iris scans, or facial images, your voice conveys a great deal of your emotion and can alter your words’ meaning.

Cloning to steal, hurt, or deliver fake information

While all forms of biometric identification are a potential source of fraud, VC may soon provide unlimited opportunities to deceive. The nation has just endured a lengthy election season. Imagine a voicemail that sounds exactly like a top candidate admitting to wrongdoing; then envision it going viral just days before the polls close. Deep fakes, even when they are deleted rapidly, have a way of lingering and damaging their targets.

This sort of deep fake isn’t just a concern on the distant horizon; it is here today. Just ask House Speaker Nancy Pelosi (D-CA), who recently saw her own words slowed down by opponents seeking to portray her as drunk. And that is just an example of the most basic type of voice deception; actual word substitutions will be far worse.

Scams and fraud opportunities are inevitable, and some already rely on recycled themes. Consider the infamous grandson scam, where an alleged relative calls from jail or the hospital asking for money ASAP. With voice cloning, the voicemail can sound just like the victim’s grandson, making the scam more sophisticated and far more likely to succeed.

Another threat involves Business Email Compromise (BEC) fraud. That voicemail directing you to wire money to a new client sounds precisely like your CEO, but it could be a deep fake. Annual W-2 forms are also a top BEC target; stolen forms generate millions of dollars in income tax fraud each year.

Consider what a rejected ex bent on revenge could do with voice cloning. Deep fakes leaked online could destroy a former partner’s reputation or even put their life in danger.

Unauthorized impersonations, banking fraud, theft of intellectual property, and stock price manipulation are all possible. Cloned voice commands that unlock devices, or that direct your digital home assistant to read out your texts and emails, are now within a scammer’s or snoop’s reach.

The end of trust, credibility and reputation

What’s at stake here? When misused, voice cloning could ruin someone’s business or personal reputation. It could also attack their credibility and cause significant mental and/or financial harm. Deep fakes that reach an employer’s eyes or ears could cause massive career headaches.

Then there is the matter of trust. If we cannot trust the voice on the phone that sounds like a niece, nephew or long-lost friend, what can we trust? Trust is easy to lose but incredibly challenging to regain. Unless science makes giant leaps very soon, this new world could make us all a little bit less trusting and more cynical.

Is manipulating someone’s voice ethical?

Nope. Manipulating someone’s voice for nefarious purposes is not just unethical; it is also illegal. Even so, the nation may need new laws or regulations specific to voice cloning, and soon.

Congress has been contemplating laws to govern cloning, but hackers don’t care about fines or penalties when they can simply avoid detection. You need to question every voice you hear, even familiar voices, especially if they are asking for something out of the ordinary. When possible, ask questions that only the real person could answer, and insist on a video chat if something seems amiss.

Researchers are working on algorithms to separate fakes from genuine speech, but those solutions are not yet available, while scammers are already hard at work. As always, never send a dime to anyone who might be impersonating a friend or relative; call your contacts directly to confirm their situation instead. Voice cloning tricks often employ the same tactics cybercriminals already use over email, so the same tips for spotting phishing scams can help you identify deep fakes as well.
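As a rough illustration of how such detectors might work, the sketch below trains a toy classifier to separate genuine clips from cloned ones using summary acoustic features (MFCCs). The file names and tiny dataset are hypothetical placeholders; real detection research uses far larger datasets and much deeper models.

```python
# Toy deep-fake audio detector: classify clips as genuine or cloned based on
# mean MFCC features. File names and labels here are hypothetical examples.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def mfcc_features(path: str) -> np.ndarray:
    """Load a clip and summarize it as a fixed-length vector of mean MFCCs."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

# Hypothetical labeled training clips: 0 = genuine speech, 1 = cloned speech.
genuine = ["real_01.wav", "real_02.wav", "real_03.wav"]
cloned = ["fake_01.wav", "fake_02.wav", "fake_03.wav"]

X = np.array([mfcc_features(p) for p in genuine + cloned])
y = np.array([0] * len(genuine) + [1] * len(cloned))

detector = LogisticRegression().fit(X, y)

# Estimate the probability that a suspicious voicemail is synthetic.
prob_fake = detector.predict_proba([mfcc_features("voicemail.wav")])[0, 1]
print(f"Probability the voicemail is cloned: {prob_fake:.0%}")
```

Until detection tools like this mature and reach consumer products, skepticism and direct verification remain your best defense.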

IDShield is a product of Pre-Paid Legal Services, Inc. d/b/a LegalShield (“LegalShield”). LegalShield provides access to identity theft protection and restoration services. For complete terms, coverage and conditions, please see an identity theft plan. All Licensed Private Investigators are licensed in the state of Oklahoma. This is meant to provide general information and is not intended to provide legal advice, render an opinion, or provide any specific recommendations.

