The rapid rise of generative artificial intelligence is providing new ground for cryptocurrency scammers to carry out their illicit activities, which remained at elevated levels in 2024.
Cryptocurrency scams generated at least $9.9 billion on-chain, meaning revenue from transactions that took place on the blockchain and are recorded on its public ledger, according to figures released by Chainalysis on Thursday.
“Generative AI is amplifying scams, the leading threat to financial institutions, by enabling high-fidelity, low-cost and highly scalable fraud that exploits human vulnerabilities,” Elad Fouks, head of fraud products at Chainalysis, wrote in the report.
Those scams were led by one scheme in particular, known as pig-butchering, which takes advantage of the emotions of unsuspecting victims.
Pig-butchering scams
Pig-butchering refers to scams where fraudsters earn their victims' trust, usually through a fictitious romantic relationship, before duping them into making investments in bogus cryptocurrency projects.
Revenue from such scams rose by almost 40 per cent year-on-year to roughly $3.3 billion in 2024, Chainalysis data shows.
Perpetrators have also diversified the scam model, which initially relied on a long-term approach: a scammer would spend months or even years building a relationship with a victim before persuading them to invest in fake projects.
Increasingly, con artists have developed methods with quicker turnarounds, such as employment or work-from-home scams, which typically yield smaller victim deposits yet add up to significant amounts.
The numbers reflect this shift: deposits to pig-butchering scams rose by nearly 210 per cent year-on-year in 2024, possibly indicating an expansion of the victim pool, Chainalysis said. Meanwhile, the average deposit amount fell 55 per cent over the same period, and the combination of smaller payments and more deposits could point to a change in strategy, the company added.
"We can’t allow ourselves to be easy targets ... always remain sceptical of unsolicited messages or 'fabulous' investment opportunities: if it seems too good to be true, it most likely is," Ivan Milenkovic, a vice president at California-based IT security firm Qualys, told The National.
Pig-butchering has its roots in large-scale scam operations in South-East Asia, the Chainalysis study said. International Justice Mission, a global body that protects people in poverty from violence, began observing forced labour cases tied to these operations in 2021, and has since recorded immense growth in these crimes.
However, this type of scam has become more geographically dispersed, especially in the past two years. One example came in December 2024, when 48 Chinese and 40 Filipino nationals were arrested in Nigeria on charges of running a crypto scam operation that targeted people mostly from Europe and the Americas.
Interpol last year also co-ordinated a global operation against such scams, including a case in Namibia in which 88 young people were forced to carry out fraud for an international ring. Peruvian authorities also rescued 43 Malaysian citizens who had been trafficked to Peru and forced to work in a scam operation.
"If someone is offering you something too good to be true [it could be a] prelude to an attack," Morey Haber, chief security adviser of US cybersecurity company BeyondTrust, told The National.
The Chainalysis report also highlighted high-yield investment scams (HYIS), in which fraudsters promise outsized returns on investors' money. HYIS accounted for more than 50 per cent of scam inflows among scam sub-classes, although those inflows dropped by 36.6 per cent annually, the report said.
Crypto drainer scams, where assets are siphoned out of wallets, and rug pulls, where scammers abruptly abandon projects and make off with the funds invested in them, were also on the rise last year.
Crypto ATMs, which have been around for more than a decade, have also become a weapon of choice for scammers. The Federal Bureau of Investigation has said it has received thousands of reports of cyber criminals using crypto ATMs to collect scam payouts, a trend that has driven a tenfold surge in funds lost in the US.
How is AI being used for fraud?
Fraudsters use several AI-based techniques, including language translation, which lets threat actors send e-mails, text messages or voice mails with flawless grammar in almost any language.
"This circumvents the best content filters, even if they are AI-driven, by eliminating detections based on poor grammar, spelling, or other questionable content," Mr Haber said.
Then there are deepfakes, most notably used in images and videos that mimic real-world personalities and situations, giving the impression that they are real.
These techniques are making it tougher for authorities to crack down on crypto scammers. This has led to heightened calls for more collaboration to combat illicit activity in the growing sector.
Generative AI produces "realistic fake content, including websites and listings, to power investment scams, purchase scams and more", which makes "attacks more convincing and harder to detect", Mr Fouks said.