Mumbai: AI voice scams, in which fraudsters use voice-cloning technology to mimic someone's voice, are becoming increasingly common. The scammers target victims by posing as family members, friends, or even customer service representatives, aiming to trick people into sharing personal information or sending money.
Common AI voice scams include:
Impersonating a family member or friend: The scammer pretends to be a relative in trouble and then asks for money, using familiar names to make the scam more believable.
Impersonating a customer service representative: Scammers claim to be from a company the victim deals with, such as a bank, and ask for personal information or payments.
Posing as a government official: Scammers pretend to be from agencies like the IRS, threatening legal action if the victim doesn’t comply.
Tips to avoid falling prey to AI voice scams:
Never share personal information over the phone unless you are sure of the caller's identity.
Be cautious if someone urgently asks for money or personal details.
If in doubt, hang up and call the person or company back directly on a known number.
Stay informed about the latest scam tactics, as scammers regularly change their methods.
Report any suspicious activity to the authorities immediately.
These scams are a reminder to stay vigilant and double-check when faced with urgent requests, especially if they involve money or personal information.