If scams feel harder to spot lately, that’s not your imagination. Artificial intelligence has changed how fraud works—making messages smoother, faces more familiar, and stories more convincing than ever before.
As AI tools become more powerful and widely available, scammers are using them to blur the line between authentic and artificial. According to Chainabuse, a leading global platform for reporting malicious crypto activity, reports of AI-enabled scams surged 456% between May 2024 and April 2025 compared with the same period a year earlier, which had itself seen a 78% increase over 2022–23. In other words, the most dramatic spike wasn't simply more scams; it was scams supercharged by AI tools that make fraud faster, cheaper, and more believable. The financial impact is staggering: Americans lost $12.5 billion to fraud in 2024.
Why AI-Enabled Scams Are Different
Generative AI can produce realistic text, images, audio, and video at scale. For scammers, that means fewer obvious mistakes and greater reach. Instead of clumsy emails or generic scripts, bad actors can now create polished messages, believable personas, and lifelike videos—fluently translated and tailored to a specific audience.
AI agents have amplified both the volume and the sophistication of scams. Operating with minimal oversight, they can scrape public information, personalize outreach, manage fake customer support channels, and test which tactics work best. What once required a large team can now run faster, cheaper, and with far greater precision.
Impersonation Scams Are Becoming More Personal
AI is increasingly used to impersonate executives, coworkers, friends, and family members. From just a short snippet of audio, scammers can clone a person's voice convincingly. In some cases, employees have transferred large sums after joining what appeared to be legitimate video calls with senior leadership.
In June 2025, scammers used deepfake videos of a well-known doctor promoting an investment opportunity. One of the scam’s victims was 82-year-old Maurine Meleck of Florida, who lost $200,000 she had saved for her grandson with autism. “I handed nearly every penny I have over to a fraudster, while believing I was growing my nest egg for my grandson,” she said.
Using AI to Fight Back
But there's good news too: AI is also being used to detect and disrupt fraud. Financial institutions and investigators use machine learning to trace stolen funds and identify emerging scam patterns far more quickly than manual methods allow.
Public awareness is just as important. If you suspect you’ve encountered a scam, file a complaint at Fraud.org and at ReportFraud.ftc.gov. It takes just a moment, and it helps authorities track patterns and protect others.
Protect Yourself
You don't need technical expertise to protect yourself. A few simple habits will help:
Slow down when urgency is used to pressure you
Verify requests for money through a second channel
Be skeptical of endorsements—especially involving cryptocurrency
Remember that text, audio, and video can all be manipulated
AI may be making scams more convincing, but awareness remains one of the strongest defenses. For deceptions that are harder to detect, services like Melaleuca's InfoGuard Advanced Identity Protection monitor your data and privacy 24/7. The better we understand how AI can be misused, the better equipped we are to protect ourselves and each other.