The FBI reports a 500% increase in AI voice-cloning scams, in which criminals create real-time voice replicas of family members to demand emergency money transfers. Victims lose an average of $15,000 per incident.

How It Works

Scammers need just 3 seconds of audio, pulled from social media videos or voicemail greetings, to clone a voice convincingly. They then call parents while posing as their children in distress and demand immediate wire transfers.

Protection

The FBI recommends establishing a family code word and asking for it in any emergency involving a request for money. Voice-verification apps that detect AI-generated calls are also emerging.