AI voice fraud: Grandchild trick with deceptively real voice fakes



Scammers are taking the grandchild trick to a new level with voice fraud. With the help of AI, deceptively real fakes of your relatives' voices lure you into a trap and demand money.


Cybercriminals are using artificial intelligence (AI) for a perfidious new variant of the grandchild trick. The supposed relative no longer contacts you via WhatsApp chat, but by phone call or voice message. A deceptively real-sounding voice of a relative then demands money from you under an urgent pretext.

For this voice scam, the fraudsters clone voices. Audio files published on the internet serve as source material; as little as three seconds of audio is enough for a clone. An AI analyzes the material and can then imitate the voice of your family members, or your own. The criminals feed the AI the appropriate script, and on the phone it suddenly sounds as if your son or sister needs money immediately.

According to a McAfee survey of 1,007 respondents, 75 percent have already fallen for such a scam and lost money. But you can recognize this attempted fraud and protect yourself. As with phishing emails, do not trust unknown senders. It is therefore better to hang up on such calls and contact the person yourself on a number you know. Asking questions that only the real person can answer can also expose the scammers.


Also listen carefully to whether it really sounds like the supposed person. Most AI programs work primarily in English, which is why German voice output is not yet error-free. However, it should not be long before this voice fraud sounds deceptively real in German as well. So stay extra vigilant.
