AI Scams: Fraudsters Use Realistic Voices to Trick Relatives
Fraudsters are using AI voice cloning to create convincing scam calls that mimic the voices of relatives or friends. Unlike traditional phone scams, which often focus on the elderly, these 'AI scams' target people of all ages.
AI scams typically involve a scammer posing as a relative or friend in an emergency and asking for money. Scammers use online voice generators to create and fine-tune the audio. Companies like Voisa.ai, Murf.ai, and Lovo.ai offer AI-based voice cloning services that can generate highly realistic synthetic voices, making AI-created audio difficult to recognize. However, pauses or unnatural phrasing in the conversation can serve as clues.
To protect yourself, do not give out personal information during such calls. Instead, ask specific questions that only the real person could answer to verify the caller's identity. Agree on a secret code word with close relatives to use during suspicious calls. If you suspect a call might be an AI scam, hang up and contact the supposed acquaintance or relative directly through a number you know to be theirs. Note the date, time, and phone number of the call for future reference.
AI scams are a growing threat. By staying vigilant, asking questions, and maintaining open communication with loved ones, we can all help combat this deceptive use of technology.