The Federal Bureau of Investigation issued an alert today warning businesses about a surge in deepfake voice scams targeting corporate executives. The agency said it has received over 400 reports since January of scammers using AI-generated voice cloning to impersonate CEOs and CFOs, instructing employees to wire funds or share sensitive financial information.

In the most sophisticated cases, the cloned voices are nearly indistinguishable from the executives' real voices, reproducing speech patterns and cadences derived from publicly available earnings calls, conference presentations, and media interviews. The FBI said losses from these schemes have exceeded $75 million in the first quarter of 2026 alone, with individual incidents ranging from $50,000 to $8 million.

The bureau recommends that companies implement multi-factor verification for any financial transaction requested by phone, including callback procedures using independently verified numbers. Establishing regularly rotated code words or phrases shared between executives and finance teams can also help detect impersonation attempts. Companies are further advised to limit the availability of executive voice recordings online where possible and to train employees on the growing sophistication of AI-powered social engineering attacks.