The closing date for Raegan Bartlo’s new home in West Virginia was approaching, so she wasn’t surprised when she received an email on August 23 that appeared to be from the title company’s attorney with instructions to wire a $255,000 down payment.
She sent the money to the account listed in the email, assuming everything was on track with her planned move from Alexandria, Virginia. Two days later, she learned that she had actually transferred the money to a scammer.
She believes the scam email was written with software powered by artificial intelligence.
“It was terrifying. You feel violated. I thought we had lost everything,” said Bartlo, who previously worked on cybersecurity issues at the Analysis and Resilience Center for Systemic Risk, where she learned to recognize the hallmarks of a scammer’s email.
This one, she said, was different, with a conversational yet professional tone and no grammatical red flags — features that cybersecurity experts say are typical of AI-generated text.
Across the country, cybersecurity officials say, scammers are using generative AI programs to pose as real estate agents, lenders or other parties to a home sale — mimicking their writing style in emails or their voice in voicemails to trick unsuspecting recipients into transferring money to accounts controlled by the scammers.
These schemes are “almost a perfect crime,” said Tom Cronkright, co-founder and executive chairman of CertifID, a real estate fraud prevention company. Huge sums of money are at stake, and buyers often want to move quickly in a tight housing market.
Because most buyers and sellers don’t have much experience with these transactions, they typically take their cues from their agents, lenders and the title company — or from people pretending to be them.
“You throw generative AI on top of it, and you say, ‘My agent called me, I got this voicemail, I read this email, and yes, I wired it, because I’m so fatigued by this process that I will do everything I can to make sure it closes,’” Cronkright said.
During a telephone interview, Cronkright recorded 13 seconds of a reporter’s voice and then used an AI program to have a voice clone pose as a real estate agent and instruct a customer to make a bank transfer. He also quickly typed a prompt to create an AI-written email with wiring instructions, usually in a formal tone, but with a golf joke thrown in for good measure.
Matt O’Neill, who formerly led cybercrime investigations at the US Secret Service, said he first encountered this type of scam “almost the day after ChatGPT was released” in late 2022.
Scammers obtain the information they need through various techniques, such as hacking accounts and phishing real estate professionals for transaction data. They can also mine publicly available real estate databases and listings for persuasive details to add to their scam emails, such as the paint color of the master bedroom, O’Neill said.
Free AI programs like ChatGPT allow scammers to write better phishing emails, O’Neill said — messages that trick recipients into clicking a link or attachment. “It lowers the bar for bad actors who haven’t been in the game before, because they don’t need advanced skills,” he said.
Two FBI officials with expertise in complex financial and cyber fraud said they have seen an increase in housing fraud cases involving AI.
Previously, one of the officials said, fraudulent emails and text messages were often written by people with poor English proficiency, making the tone too formal or too informal, or producing sentences with grammatical errors and awkward wording.
Now, the official said, asking a generative AI program for a business-oriented email that could convince someone there was a problem with their closing, and that they should change bank accounts, “is very easy to generate, and can be written out by the software within seconds.”
The FBI has no data on housing fraud involving AI, the official said, because the use of AI itself is not illegal and because it is difficult to prove that AI was used in a specific case.
But earlier this month, the FBI issued a public service announcement warning of the increasing use of AI in fraud schemes to generate persuasive text, audio, video and other communications. That followed a warning last month from the Treasury Department’s Financial Crimes Enforcement Network (FinCEN) about a recent increase in AI-generated “deepfakes” targeting financial institutions.
In 2023, the FBI received 21,489 complaints of “business email compromise” — email fraud involving fraudulent fund transfers — totaling more than $2.9 billion in losses, according to the FBI’s annual Internet Crime Report. In overall cybercrime, DC had more complaints and losses per capita than any state. Maryland and Virginia were both in the top half of states. (The report does not break out state-level data on business email compromise schemes.)
It is impossible for Bartlo and other victims to know for sure whether their scammers used AI. But her cybersecurity background taught her to recognize a human-written phishing email. Her scammer’s email, she said, didn’t read like one.
“Honestly, it never occurred to me,” she said of the possibility that the sender might not actually be the title company’s lawyer. “The way the person did it, it was really good.” She later discovered that one digit was wrong in the fax number in the email signature.
Bartlo and her husband, Michael, were ultimately able to recover more than half of their money thanks to the cooperation of the banks on both sides of the transaction. But the couple still lost more than $112,000, forcing them to dip into their retirement savings before they could withdraw without tax penalties.
“I keep paying for this criminal, and the criminal gets away with whatever he wants,” said Bartlo, 49.
But the incident didn’t stop them from buying the house in West Virginia’s Bunker Hill community, close to the Virginia border, where mortgage payments and living costs are much lower than in Alexandria.
“It’s beautiful,” Bartlo said of her new home.
Three days before they finally closed on the house, Bartlo received an email almost identical to the first.
Once again it was courteous and professional, and read as if written by the title company’s lawyer. And again the message asked her to deposit money into a bank account.
This time, Bartlo said, she didn’t fall for it.