AI: they clone the voices of your loved ones to scam you, here’s how to protect yourself


Imagine you receive a phone call informing you that someone close to you has a problem. Your instincts would probably tell you to do anything you can to help, including making a wire transfer.

Scammers are aware of this Achilles heel and are now using AI to exploit it.

An article from The Washington Post features an elderly couple, Ruth and Greg Card, who were targeted by an impersonation scam of this nature.

$11 million evaporated

Ruth, 73, received a call from someone she thought was her grandson. He told her he was in jail, without a wallet or cell phone, and needed money fast. Like any worried grandparents would, Ruth and her husband Greg, 75, rushed to the bank.

It was only at a second bank that a manager warned them he had seen a similar case before that turned out to be a scam, and that their story looked like one too.

And this is not an isolated case. The article states that in 2022, impersonation scams were the second most common type of scam in the United States, with more than 36,000 people falling victim to criminals posing as friends or family members. Of these scams, more than 5,100 took place over the phone, costing victims over 11 million dollars, according to the FTC.

Sound virtually indistinguishable from the original source

Generative AI has been all the rage lately thanks to the growing popularity of programs such as OpenAI’s ChatGPT and DALL·E. These programs quickly became associated with their ability to boost user productivity.

However, the same techniques used to train these language models can also be used to train more harmful programs, such as AI voice generators put to malicious use.

These programs analyze a recording of a person’s voice, identify what makes it unique, including pitch and accent, and then recreate it as a synthetic voice. Many of these tools work in seconds and can produce audio that is virtually indistinguishable from the original source.

What you can do to protect yourself from synthetic voices

So what can you do to avoid falling into the trap? The first thing to do is to be aware that this type of call is possible.

If someone close to you calls for help, remember that it could very well be a machine imitating their voice. To make sure it is really your relative, try to verify the source.

  • Try asking the caller a personal question that only your loved one would know the answer to. It could be as simple as asking for the name of a pet, a family member, or some other very personal detail.
  • You can also check your loved one’s location to see if it matches where the caller says they are. Sharing your location with friends and family is common today, and in this case it can be very useful.
  • You can also try calling or texting your loved one from another phone to verify the caller’s identity. If they pick up or text back and have no idea what you are talking about, you have your answer.

Finally, before making any major financial decisions, consider contacting the authorities for advice on the best course of action.

Source: ZDNet.com




