The FBI reports a 400% increase in AI voice cloning scams in 2026, with criminals using as little as 3 seconds of audio to create convincing replicas of victims' voices and defraud family members and businesses.

How It Works

Scammers scrape voice samples from social media videos, voicemails, and recorded phone calls. AI tools can generate a realistic voice clone from as little as 3 seconds of audio, letting a scammer impersonate that person in a live phone call.

Common Scam Scenarios

The schemes mirror the two targets noted above. In family emergency scams, a cloned voice of a child or grandchild claims to be in trouble (an arrest, an accident, a kidnapping) and urgently needs money. In business fraud, a cloned executive's voice instructs an employee to wire funds or share credentials.

How to Protect Yourself

- Establish a family safe word that scammers could not guess, and never share it online.
- If you receive an urgent call, hang up and call the person back on a number you know is theirs.
- Never send money based on a phone call alone, no matter how convincing the voice sounds.
- Be cautious about posting voice content on social media, since even a short clip can be cloned.