How to tell when the son in need on the phone is actually an AI

AI speech generation software has come a long way in recent years. What initially sounded like a tinny, monotone computer voice can today hardly be distinguished from the human original, and that is exactly what scammers are taking advantage of.

As the Washington Post reports, more and more people are falling victim to phone scammers who use AI voices. Elderly people are particularly at risk from callers claiming to be family members in need. The scammers’ stories are sometimes wildly implausible.

In one case described by the Washington Post, an elderly couple transferred $15,000 to a scammer through a bitcoin terminal, believing they had spoken to their son. The AI-generated voice told them he had to pay legal fees after being involved in a car accident in which a US diplomat was killed.

Losses in the millions

According to the Federal Trade Commission, so-called impostor scams are widespread. More than 5,000 people were scammed this way in the United States in 2022, losing a total of around 11 million US dollars.

Originally, these AI voice models were developed to advance text-to-speech generation. This opened up new possibilities for speech processing that are also used in the film industry, for example, because famous voices can be imitated with little effort.

However, the deceptively real-sounding voices are also being used by scammers. This prompted the FTC to publish AI guidelines as early as 2021, telling companies that such models should do more good than harm and that they should be prepared to take responsibility for the risks of their products, because at the moment nobody knows who is liable when the technology is misused.

How can users protect themselves?

Not every call has to be announced in advance, but a short message by email, SMS or a messenger such as WhatsApp saying that a call is coming removes any suspicion on the other end that the call might be fake. A simple “Hey, can I give you a call?” is enough.

Don’t rely on phone numbers either, as these can be spoofed. If a caller appears to be your bank, for example, it can’t hurt to hang up and call back to make sure it isn’t a scammer. Sensitive information such as your social security number, home address, date of birth, telephone number and bank details should only be given out over the phone when it is absolutely clear that a real, trustworthy person is on the other end rather than an AI. Even middle names, parents’ names and pet names should be treated with care, since they are often used in security questions.

One last key piece of advice is to listen. Especially when scammers pretend to be close confidants, it is important to listen carefully. Is the caller’s choice of words typical for that person, and do they react oddly to follow-up questions? Atypical behavior like this can be a sign of fraud.
