
AI Scams: Fraudsters Use Realistic Voices to Trick Relatives

AI scams are on the rise, using realistic voices to trick unsuspecting relatives. Stay safe by asking questions and verifying identities.

Fraudsters are employing advanced technology to create convincing scam calls, using AI to mimic the voices of relatives or friends. Unlike traditional scams, which often focus on the elderly, these sophisticated 'AI scams' target people of all ages.

In an AI scam, the caller pretends to be a relative or friend in an emergency and asks for money. Scammers use online services to create and fine-tune audio files with voice generators. Companies like Voisa.ai, Murf.ai, and Lovo.ai offer AI-based voice cloning services that can generate highly realistic synthetic voices, making AI-created audio difficult to recognize. However, gaps or unnatural phrasing in the conversation can serve as clues.

To protect yourself, do not give out personal information during the call. Instead, ask specific questions that only the real person could answer to verify the caller's identity. Agree on a secret code word with close relatives to use during suspicious calls. If you suspect a call might be an AI scam, end the call and contact the supposed acquaintance or relative directly. Note down the date, time, and phone number of the call for future reference.

AI scams are a growing threat, targeting people of all age groups. By staying vigilant, asking questions, and maintaining open communication with loved ones, we can all help combat this deceptive use of technology.
