If you haven’t fallen for a scam yet, it’s not because you’re too clever — it’s because you’re lucky, says Nick Stapleton. Eventually, the right scam will get you on the wrong day.
The investigative journalist and presenter of the BBC television series Scam Interceptors is renowned for battling with scammers as he races to thwart real-life online fraud attempts on the show, revealing increasingly sophisticated techniques — and tech — that cyber criminals have at their disposal. He can see, he adds, how artificial intelligence is taking scams to the next level.
An estimated 9mn people in the UK fell victim to financial fraud in the past year, according to Citizens Advice, with fake debt advice and friend-in-need scams claiming the most victims.
“Scams are basically the most profitable thing on the planet right now,” claims Stapleton, whose new book, How to Beat Scammers, came out this week. It reads like a compendium of modern-day scams — and is full of staggering insights into this hidden multibillion-dollar global industry (if we can call it that).
FT readers are among those who need to watch out, he says. Well-funded criminal networks are increasingly targeting wealthier victims, “going after much bigger amounts of money”, and using deepfake videos and generative AI to lure them in convincingly.
Just a few days ago, Stapleton says, he was approached by a man who was “six figures deep” into a cloned website he thought was being run by a genuine British hedge fund. Victims, he says, are often young professional men who have one thing in common — they thought they were too smart to fall for a deception.
In fact, investment scams account for just over one quarter of all authorised push payment (APP) fraud cases, where victims are duped into directly transferring money to a scammer. The amounts of money lost by individual victims are sizeable, with the average being over £25,000. Yet scammers are using AI to push the limits of this fraud even further with “pig butchering”, which Stapleton describes as “without doubt, the single most profitable and fastest-growing scam on the planet”.
So-called pig butchering scams start with a bond of trust forged between the scammer and a wealthy victim who, after months of careful grooming and manipulation, is wiped out with a carefully staged investment fraud.
“You could be a member of some kind of legitimate web forum or group chat about investing, and another member from within that group contacts you,” he says. After months of fairly innocuous messages back and forth, the scammer gains the victim’s trust.
Perhaps the scammer poses as a financial adviser or somebody running an investment firm — which is how the six-figure victim Stapleton met became ensnared. In reality, their interlocutor will be thousands of miles away, and may not even speak English, thanks to the widespread availability and low cost of generative AI and automatic translation software. This is redrawing what Stapleton terms “the scam map of the world”, from India and west Africa to south-east Asia and the Middle East.
As with romance scams, fraudsters rarely ask for money directly — that would arouse too much suspicion. They float a plausible scenario and wait for the victim to bite and say they want in.
To snare wealthy victims, organised crime groups invest in creating highly convincing fake websites. Scammers take custody of funds, giving victims realistic-looking returns on their on-screen account balance. “They can often take money out, and feel confident that they’re actually making the profits that they’re seeing,” Stapleton says. As the months go by, victims invest more and more money, until one day their login stops working.

The big-ticket nature of these scams means there’s a danger victims will fall outside the scope of new UK bank fraud reimbursement rules, now capped at £85,000. If they are groomed into setting up a crypto account in their own name, transferring funds that are later stolen, banks typically refuse to pay out. Just 53 per cent of the £56.4mn lost to investment scams in the first half of 2024 was refunded to victims, according to the latest UK Finance data.
Stapleton also predicts that CEO scams, where subordinates are coerced into sending money by someone posing as their boss, will become more common, aided by the rise of deepfake video. Currently, CEO fraud represents just 1 per cent of total APP cases by volume, but it has the highest average losses, at just under £50,000.
“People are no longer able to distinguish what is real from what is AI — and that is a very scary place to be,” he says. In the case of “granny traps”, scammers use AI images as bait on social media to find older victims who struggle to tell the difference (see box).
Through his television work, Stapleton has traced the origins of UK fraud attempts, witnessing how scams occur on an industrial scale in India. Often, he says, legitimate outsourcing businesses working for Western clients covertly run a “scam floor” as part of their business activities, which makes it much harder to separate and detect.
“The sheer size and blasé nature of the very large scam call centres is the thing that continues to strike me,” he says. “We’re not talking about a couple of guys working in a sweaty room on the edge of town. One employed hundreds of people as a massive public-facing entity with its own IT department and payroll — on the face of it, a completely normal business.”
The financial rewards of cross-border fraud are high, but what Stapleton finds the most shocking is that the likelihood of getting caught is virtually nil.
He is struck by scam bosses’ “complete lack of concern when we confront them, and reveal the amount of information we have about them, how long we’ve been watching them for . . . they just know that there’s no chance that they’re ever going to get done, and it’s an upsetting reality.”
If one country decided to mount a crackdown, the virtual nature of financial crime makes it easy for scammers to move their operations. In his book, Stapleton cites UN estimates that almost a quarter of a million people may be “working” as indentured labour in scam compounds in south-east Asia, trafficked from China, Vietnam and Cambodia on false pretences, tempted by job adverts for telesales positions. Using forced labour and low-cost technology to unlock rich pickings from Western victims makes this vile crime even more profitable.
The complexity of scam detection and prosecution means the Western world has focused on prevention. Stapleton is pleased that UK banks have upped the ante, but notes that in the vast majority of scams, “their involvement in it is the last two metres of a 100-metre sprint”.
But with more than 70 per cent of scam attempts originating online, Stapleton says social media platforms are not doing anywhere near enough.
“They just don’t care,” he says. “They’re paying lip service to doing something about this problem. From the recent step back in content moderation, to a blue tick being something you can buy . . . I am not confident President Trump sees [fighting fraud] as a priority.”
The UK’s Online Safety Act, due to come into full force in the second half of 2025, will increase obligations on social media firms, but stops short of putting them on the hook for a share of fraud losses.
Nevertheless, social media firms, banks and telecommunications companies are getting better at sharing information, says Ben Donaldson, managing director of economic crime at UK Finance, the banking industry body.
“To make it harder for fraudsters to target people in this country, we need to do more to stop scams at source, sharing information in both directions to simultaneously take down fake websites and social media profiles, and the bank accounts and phone numbers associated with that fraud,” he says.
Finding a better mechanism to share this information across sectors, and with law enforcement, is the key, he says. “We can’t be proactive without intelligence, and all agencies have a part to play in bringing the information together in the right place to deny criminals whatever service they need to commit that crime.”
Stapleton believes that until the UK creates a more effective centralised reporting system to stop scammers in their tracks, upping the level of “scam literacy” among the general population — including schoolchildren — is the only answer.
“We need to have a wider conversation as a country about the fact that we’ve been far too complacent about this for far too long.”
How to Beat Scammers: The Complete Guide to Keeping Yourself Safe from Fraud was published this week by Michael O’Mara Books.
Four ways AI is taking scams to the next level
‘Pig butchering’ scams
So-called pig butchering scams originated in China, where they are called “Sha Zhu Pan” (literally translated as “pig killing plate”). Scammers target wealthy individuals and play a long game, spending months fattening the “pig” before killing it (Interpol recently objected to this description, claiming it shames and stigmatises victims). These kinds of scams begin with emotional manipulation — which could be romantic, or developing a relationship of trust with someone posing as a professional — with generative AI and translation software making text-based messaging more convincing. Eventually, victims are lured into an investment scam, often involving the use of cloned websites and fake trading platforms.
The granny trap
Scammers post AI-generated images on social media that are designed to pull at the heartstrings — an animal in need of care, or an incredible artwork made by someone, for example. Often captioned “No one appreciates this”, these posts generate comments from well-meaning ladies of a certain age who think the image (and the sob story attached) is real. Their lack of tech savvy flags them as a potential soft target for scammers to exploit further. A scammer using a fake account will politely respond in the comment field: “What a lovely person you seem, I’ve had a look at your profile and I’d like to send you a friend request.” Often, the scammer’s profile bears a fake image of a trustworthy person, such as an army veteran. This connection could be the start of a lucrative romance scam attempt.
Deepfakes
A growing number of investment scams and purchase scams involve creating deepfake videos, using AI to clone speech or moving images of celebrities or trusted figures from the world of finance who appear to endorse an investment or product that turns out to be fraudulent.
Last year, one man lost £75,000 after watching a deepfake video on social media of Elon Musk and the money-saving expert Martin Lewis appearing to endorse a non-existent cryptocurrency investment scheme. Stars of TV shows Dragons’ Den and The Apprentice are routinely targeted. While social media platforms say they will use facial recognition to spot and take down fake ads, such ads don’t have to stay up for long to gain traction. Stapleton says giveaways include odd mouth shapes that don’t match the words you are hearing, and a lack of blinking — though these will get harder to spot as the technology progresses.
CEO fraud
Deepfakes have started to be used in highly targeted corporate raids. Last year, UK engineering firm Arup lost $25mn (£20mn) after a staff member in Hong Kong was persuaded to make 15 bank transfers after scammers digitally cloned the company’s chief financial officer on a video conference call. Scammers can use information available online about an organisation’s structure and reporting lines to research and identify targets. A senior staff member demanding urgent payment of a late invoice is a common ruse. These scams may be even more convincing if company email addresses are hacked, with messages appearing to come directly from senior staff (scammers ensure any replies never reach the real person). Other CEO frauds involve employees being ordered to purchase vouchers — allegedly as bonuses for staff — and then unwittingly sending the codes to scammers.