Apply, Lie, Move In: AI Is Making Rent Fraud Easier
Los Angeles landlord Michael Renkow already had a sinking feeling when one of the new tenants at his Hollywood apartment building, “Igor,” was two hours late to pick up the keys.
In September 2025, Igor had applied for two separate apartments, both running $5,300 per month, and had been approved. His bank statements, proof of employment and ID had cleared.
Two days later, after his bank alerted Renkow that the cashier’s checks used to pay first and last month’s rent were fraudulent, the feeling crystallized. Renkow raced back to his property, hoping to intercept the new tenants before they moved in. He found the front door of one unit pinned from the inside and, in the other, a woman who told him that she had established residency and didn’t intend to leave.
Ultimately, he discovered that the woman paid a fee to sublease the unit from the fraudulent applicant and that no one intended to pay him rent. He had to embark on a lengthy eviction process to regain access to his property.
“I’m pretty sure I’m going to sell the building. I don’t need this at 74 years old,” Renkow, a small-business owner who owns a handful of rental properties, told Bisnow. “It’s been so much stress. I’ve been sick for seven months.”
Apartment operators have for years flagged fraudulent rental applications as a thorny problem that eats into already thin margins and can ultimately raise costs across the board if it becomes too pervasive.
More recently, artificial intelligence has made it easier to fabricate pay stubs, bank statements, identification and other documents landlords use to verify a prospective tenant’s identity and ability to pay. AI also makes fraudulent documents more difficult to spot, slowing the already difficult process of finding and catching fraudsters.
“This is something that we have been working with our members on for quite a while, to flag for policymakers that this is a deeply troubling movement and something that we all need to take seriously,” said Kevin Donnelly, executive director of the Real Estate Technology & Transformation Center.
“The cost ultimately gets borne by the community and negatively impacts affordability across the board,” he said.
Law enforcement and industry sources don’t have a comprehensive accounting of the annual costs of rental fraud, but the FBI Internet Crime Complaint Center tracked more than 12,000 reported cases of real estate fraud in 2025 and estimated that online real estate fraud cost approximately $275M that year, nearly as much as credit card and check fraud.
The consequences can range from tenants simply living in apartments for which they don’t technically qualify to subleasing units to other renters, not paying rent or carrying out illegal activities on the property.
In 2024, 93% of National Multifamily Housing Council members reported that they had experienced rental fraud in the previous year. Each case of application fraud costs an average of $15K to mitigate, according to Donnelly, whose organization is affiliated with the NMHC.
For Renkow, the fraud meant spending seven months on the eviction process, during which other tenants moved out of the building because of nuisances caused by the squatters. He estimated the entire episode cost him $90K.
The “avalanche” of fraud began during the pandemic era, when a pivot toward remote viewings and online approvals, as well as eviction protections, proved a “double-edged sword,” Donnelly said.
Social media also contributed to the spread of application fraud by connecting people making fake documents with those searching for them. Now, AI is making falsified documents easier to make and more convincing, according to Tim Anderson, a manager at MRI Real Estate Software.
Obtaining fake documents once required specialized skills or fluency in navigating the so-called dark web to shop for counterfeits. Advancing technology has made it as simple as sending a direct message or learning to prompt generative AI.
MRI Real Estate Software purchased 200 AI-generated fake IDs — some for as little as $5 — and tested them against different verification methods. Optical card readers, the industry-standard system, flagged only 26% of them.
Scammers have also started to create better paper trails. In some cases, they are setting up real LLCs and issuing pay stubs from these real businesses, ostensibly creating legal documentation, according to Steve Carroll, co-founder and CEO of Findigs, an AI platform that automates rental screening.
Renkow’s case in Los Angeles was the rare example where a fraudster was caught, charged and brought to trial. The key to cracking the case was that the fake qualifying documents used by “Igor,” identified as 38-year-old LA resident Alfred Earl Jackson, contained a fake name but his own actual photo, LA Police Department Detective Juan Campos said.
Rental fraud isn’t a common case for Campos. The LAPD gets more than 400 complaints about identity fraud per month, and without the genuine photo on the fake ID — and some luck — he likely never would have tracked down Jackson. Police departments simply don’t have the resources or the time to pursue most of these cases.
Campos found that Jackson had gotten the new subletters to pay him $5K for the down payment on Renkow’s unit. Jackson and his accomplice may have also defrauded additional landlords.
Rising costs for tenant screening systems add another layer of expense to landlords’ risk management, costs that are ultimately passed on to tenants. And without much policy activity around preventing rental fraud, the burden can come down to preventive measures and more technology, according to Carroll.
Part of the struggle is the increasingly fast pace of the rental market, according to Chris Rankin, CEO and co-founder of Rent Butter, which offers screening services for workforce and affordable housing.
With more competition between landlords and mergers and acquisitions creating ever-larger portfolios, property managers are pushed to work faster and get tenants in units sooner.
Rankin framed it as a problem of not being able to place tenants as fast as landlords want. As portfolios grow, it becomes harder to hand-check every application, and borderline applicants don’t always get a fair shake. The challenge is streamlining operations while keeping out the bad actors, he said.
Some consumer and tenant advocates question whether the problem is as widespread as the industry claims. Much of the available data comes from the industry itself, according to Ariel Nelson, senior attorney at the National Consumer Law Center. She said the situation has been used to justify expanded rental‑screening technology that adds fees and can disproportionately harm certain renters.
“What is the actual scope of this problem? Is this just a justification to charge people money for applications?” Nelson said. “The screening industry has been hugely effective at convincing landlords that they need all of this information, but there is not good evidence suggesting that a bunch of the information on there actually speaks to the likelihood that someone is going to be a successful tenant.”
Tenant screening has grown into a multibillion-dollar industry offering to soothe landlord concerns about this type of crime.
The apartment industry has heavily lobbied for tenant screening and other AI tools to operate more efficiently. The NMHC supported RealPage and lobbied for a ban on localities regulating AI. The industry is focused on policymakers rather than the platforms themselves, Donnelly said.
“Unfortunately, we've seen a trend in policymaking circles to try to limit the use of AI or other tech in screening or other housing operations, and we find that to be extremely troubling, and that has really been front burner for us,” he said.
But with AI being used to both perpetrate and sniff out the fraud, the situation has become a cat-and-mouse game, with tech firms constantly gaming out new methods of fraud and trying to find fixes.
“It'll continue to be an arms race, as there's these tools that are used to create fraud, and there are these tools that are used to detect fraud, and I don't see that stopping,” Carroll said.