Up to 9% of rental applications involve falsified information


Why Rental Screening Must Evolve in the Age of AI

For rental housing providers, artificial intelligence has become both a challenge and a solution. Fraud in rental applications isn’t new, but the technology powering scams today is more accessible and more convincing than ever before. Industry surveys show that anywhere from six to nine percent of rental applications involve falsified or manipulated information, and that number climbs in markets with higher demand and limited inventory.

In multifamily communities that process hundreds or thousands of applications annually, even a seven-percent fraud rate can translate into dozens of leases signed with fraudulent applicants, costly evictions, and weeks of lost rent, at a time when vacancy pressures and rising operating costs are already squeezing landlords.
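As a rough illustration of that math, the minimal sketch below estimates annual fraud exposure. The seven-percent rate comes from the surveys cited above; the application volume, detection rate, and per-incident cost are hypothetical placeholders, not industry figures.

```python
# Back-of-the-envelope estimate of fraud exposure for a multifamily portfolio.
# The 7% rate is from the article; the other inputs are hypothetical placeholders.
applications_per_year = 2_000      # hypothetical portfolio volume
fraud_rate = 0.07                  # seven-percent fraud rate cited above
detection_rate = 0.60              # hypothetical share caught before lease signing
cost_per_bad_lease = 7_500         # hypothetical eviction + lost-rent cost (USD)

fraudulent_apps = applications_per_year * fraud_rate
undetected_leases = fraudulent_apps * (1 - detection_rate)
annual_exposure = undetected_leases * cost_per_bad_lease

print(f"Fraudulent applications: {fraudulent_apps:.0f}")
print(f"Bad leases slipping through: {undetected_leases:.0f}")
print(f"Estimated annual exposure: ${annual_exposure:,.0f}")
```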

Part of the issue is that AI has democratized fraud tools that used to require skill. “We have already seen an impact on fraud from AI,” said Akaash Gupta, Head of Tenant Screening Partnerships at Nova Credit. What used to require knowledge of editing software or specialized forgery tools can now be generated with a few prompts. That shift has made fraudulent tactics both more common and harder to detect.

The most prevalent form of rental fraud historically involved doctored pay stubs, altered bank statements, or embellished income figures designed to meet minimum qualifying ratios. These documents were often easy to spot for experienced leasing agents but could get past less seasoned reviewers.


Today, AI can reproduce realistic pay stubs with professional fonts, digital signatures, and plausible transaction histories. On top of falsified documents, housing providers are now seeing fraudsters manufacture entirely fake identities with fabricated names, credit histories, and employment records that appear genuine at first glance.

“The largest amount of fraud has to do with falsifying documents but now we are seeing completely fake identities,” Gupta said. In several pilot programs reported by screening vendors, up to 15 percent of flagged applications involved identity fabrication rather than simple document manipulation.

That evolution makes detection more complicated. “Before, it took someone who was rather sophisticated to fake documents, now AI can generate those images really easily,” Gupta said. The tools themselves — large language models and image generators — are now designed to produce output that looks professional and well formatted, which can lull reviewers into a false sense of security. Some fraud cases have even employed AI-generated voice calls or text replies to mimic legitimate applicants during leasing follow-ups.

To meet this threat, industry leaders emphasize that a single background check is no longer sufficient. Effective fraud prevention requires a layered approach that spans the rental process from initial inquiry to move-in.

“You need to check for fraud at various times in the rental process,” Gupta said. That means screening on submission of an application, during income verification, and again after conditional approvals, especially if additional documentation is provided. Many operators are layering tools such as real-time credit checks, device fingerprinting to detect suspicious application patterns, and IP address monitoring to identify bot or VPN use.
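For a concrete picture of what that layering can look like, the sketch below outlines the idea in Python. Every function name, threshold, and check is hypothetical and stands in for vendor-provided credit, device-fingerprinting, and IP-reputation services; it is not any screening platform's actual logic.

```python
# A minimal sketch of layered screening: run independent checks at several
# stages and aggregate the signals rather than relying on one pass.
from dataclasses import dataclass, field

@dataclass
class Application:
    applicant_id: str
    device_fingerprint: str
    ip_address: str
    stated_income: float
    flags: list = field(default_factory=list)

def check_device_reuse(app, seen_fingerprints):
    # Many applications from one device can indicate scripted submissions.
    if seen_fingerprints.count(app.device_fingerprint) > 3:
        app.flags.append("device reused across applications")

def check_ip_reputation(app, known_vpn_prefixes):
    # Placeholder for an IP-reputation lookup (VPN / datacenter ranges).
    if any(app.ip_address.startswith(prefix) for prefix in known_vpn_prefixes):
        app.flags.append("IP associated with VPN or hosting provider")

def check_income_consistency(app, verified_income):
    # Re-check after conditional approval, once source-verified data exists.
    if verified_income is not None and app.stated_income > verified_income * 1.25:
        app.flags.append("stated income exceeds verified income by >25%")

def screen(app, seen_fingerprints, known_vpn_prefixes, verified_income=None):
    check_device_reuse(app, seen_fingerprints)
    check_ip_reputation(app, known_vpn_prefixes)
    check_income_consistency(app, verified_income)
    return "review" if app.flags else "proceed"
```

The point of the structure is that each layer can fail independently without sinking the whole screen; a flag from any one of them routes the application to manual review rather than automatic denial.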

Beyond pattern analysis and AI detection models, pulling information directly from original sources has become a best practice. “AI can detect when documents are fake but we like to pull information directly from the financial institutions or payroll company, that avoids the possibility of them doctoring anything,” Gupta said.

By connecting to live feeds from banks, employers, or payroll processors, landlords can verify that deposit schedules, paycheck amounts, and job titles correspond with real-time data. This drastically reduces the ability of fraudsters to slip through by submitting screen-captured or regenerated documents.
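The sketch below illustrates one way that comparison might work, assuming a payroll or bank-data aggregator exposes a JSON API returning the applicant's employer and monthly income. The URL, response fields, and tolerance are hypothetical and do not describe any particular provider's API.

```python
# Compare applicant-stated figures against source-of-truth data from an
# assumed payroll/bank aggregator endpoint (placeholder URL and schema).
import requests

PAYROLL_API = "https://example-aggregator.test/v1/income"  # hypothetical endpoint

def verify_income(applicant_token: str, stated_monthly_income: float,
                  stated_employer: str, tolerance: float = 0.10) -> dict:
    resp = requests.get(PAYROLL_API, params={"token": applicant_token}, timeout=10)
    resp.raise_for_status()
    data = resp.json()  # assumed shape: {"employer": str, "monthly_income": float}

    income_ok = abs(data["monthly_income"] - stated_monthly_income) \
        <= tolerance * data["monthly_income"]
    employer_ok = data["employer"].strip().lower() == stated_employer.strip().lower()

    return {
        "income_matches": income_ok,
        "employer_matches": employer_ok,
        "verified_monthly_income": data["monthly_income"],
    }
```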

Technology itself is also helping to close the gap on false positives. Screening platforms are now using machine learning models trained on real application outcomes to improve accuracy, while natural language processing can analyze text-based answers for inconsistencies.

Some multifamily operators are using automated video verification, where applicants record themselves answering preset questions and algorithms compare the recorded face against the photo on the submitted ID. The approach isn't perfect, but combined with other verification layers it substantially raises the bar for fraudsters.
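As one illustration of that ID-photo comparison step, the sketch below uses the open-source face_recognition library to match a frame from the applicant's video against the photo on the submitted ID. The article does not name any specific tool, so this is an assumed approach rather than a description of any operator's system.

```python
# Compare a face in a video frame against the photo on a submitted ID using
# the open-source face_recognition library (one illustrative option).
import face_recognition

def id_matches_video_frame(id_photo_path: str, video_frame_path: str,
                           tolerance: float = 0.6) -> bool:
    id_image = face_recognition.load_image_file(id_photo_path)
    frame_image = face_recognition.load_image_file(video_frame_path)

    id_encodings = face_recognition.face_encodings(id_image)
    frame_encodings = face_recognition.face_encodings(frame_image)
    if not id_encodings or not frame_encodings:
        return False  # no detectable face in one of the images

    # Lower distance means a closer match; tolerance is a tunable threshold.
    return bool(face_recognition.compare_faces(
        [id_encodings[0]], frame_encodings[0], tolerance=tolerance
    )[0])
```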

Importantly, the focus on technology in rental screening isn’t only about keeping bad actors out. It’s also about improving fairness and expanding access for qualified renters. “It isn’t only about keeping bad guys out, it is about increasing access to those who deserve it,” Gupta said, referring to applicants who have been historically underserved by traditional systems.

Standardized, digital screening processes offer consistent evaluations that reduce subjective decision-making and uneven outcomes. In one survey of leasing agents before and after implementing digital screening tools, applicants from similar backgrounds received disparate decisions in nearly 20 percent of cases when reviews were manual, compared to less than five percent under a standardized model.

“Different leasing agents might have completely different conclusions. By using the same model it ensures people are being treated the same,” Gupta said. That consistency matters not just for fairness but for compliance with evolving fair housing expectations. With a digital audit trail, landlords can demonstrate that decisions were based on objective criteria rather than individual bias.

As AI continues to redefine the landscape of rental housing, its influence will only grow. It enables new fraud techniques but also powers more sophisticated defenses. The operators who stay ahead of the innovation curve will be best positioned to protect their communities, minimize losses, and ensure that qualified applicants aren’t turned away unfairly.

Source: Propmodo