How AI-powered tenant screening harms renters in the Sun Belt states

After surveying landlords and tenants in California about their reliance on and experience with algorithmic tenant screening software in the rental housing market, we set our sights on Georgia and North Carolina to see how the experience compares in the Sun Belt. These were our findings.
April 9, 2025

In 2021, a property management company rejected Mary Louis’s housing application. Why? An AI-powered tenant screening tool assigned her a score of 324; she needed to get at least a 443 to qualify. That was all the company would tell her, and there was no option to appeal.

She admittedly didn’t have the best credit score, but she had a great recommendation from her landlord of 17 years, who said that she always paid her rent on time. Was that not enough?

Unfortunately, that’s not how tenant screening tech works. It flattens people. It’s a two-dimensional analysis of a three-dimensional world with complex inequalities. This flattening—in the name of convenience—costs people access to a fundamental necessity: housing. It also allegedly perpetuates discrimination against people of color and low-income people, a violation of the Fair Housing Act.

That’s why, last year, TechEquity surveyed landlords and tenants in California about their use of and experience with algorithmic tenant screening software in the rental housing market. With those findings, we published Screened Out of Housing: How AI-Powered Tenant Screening Hurts Renters.

To better understand how tenant screening tools are being used across the country, we ran an additional survey in North Carolina and Georgia. These were our findings.

Starting in California

Our California survey was the largest known survey of landlords’ algorithmic tenant screening practices, conducted in the country’s most populous state. Our research found that:

  • 57.5% of the landlords received tenant screening reports that contained some AI-generated score or recommendation.
  • 16% of landlords reported receiving predictive information from screening companies.
  • Landlords who serve lower-income renters and smaller landlords used AI-enabled tenant screening the most.
  • Black and Latinx renters were almost half as likely as white respondents to have their rental applications accepted (46% and 43%, respectively).
  • Only 3% of renters could identify the screening company or consumer reporting agency that assessed their application. 

While our findings show this billion-dollar industry has deep roots in the heart of the tech sector, research indicates the technology may be concentrating in the Sun Belt. Why? The region saw a significant increase in both iBuying and private equity acquisitions of single-family homes following the Great Recession.

Findings from the Sun Belt

In 2024, the Government Accountability Office identified Atlanta, Raleigh, and Charlotte as three Sun Belt cities with among the highest percentages of corporate ownership of single-family homes. These corporate owners tend to lack a personal touch, relying on housing management tech—like tenant screening tools—to be more “efficient.”

Our surveys bore that out: 65% of Georgia landlords and 63% of North Carolina landlords received AI-enabled tenant screening reports. These figures may undersell the adoption of tenant screening technology, as other research on this issue indicates up to 90% of landlords now rely on AI-powered tenant screening reports to make rental application decisions.

What if these algorithms were fed inaccurate information? What if there is additional information—like a letter of recommendation stating you’re a reliable tenant—that the algorithm doesn’t account for?

Well, 36.5% of all landlords we surveyed followed the screening report’s recommendation without additional review: 37% in California, 38% in Georgia, and 35% in North Carolina.

Minority Report is real—kind of

These reports also sometimes contain suspect scores, like ones “predicting” whether an applicant will break their lease, cause property damage, or fail to pay rent—guilty before even committing the act, like the Philip K. Dick story “The Minority Report.”

Across all landlords surveyed, 17.4% received predictive analytics from tenant screening services: 16% in California, 15% in Georgia, and 25% in North Carolina.

It’s worth noting that using AI to deny people access to basic needs based on actions they have not yet taken is considered so suspect that it is banned in the EU.

Renters left in the dark

Only 3% of the roughly 2,200 tenants surveyed across the three states could name the screening or consumer reporting agency that assessed them; the rest left the response blank or mistakenly provided the name of their landlord or property management company.

How are tenants supposed to assert their rights under protections like the Fair Housing Act (FHA) and Fair Credit Reporting Act (FCRA) if they don’t know why they were rejected, or which tool rejected them?

The lack of transparency enables discrimination to go undetected, leaving renters to carry the burden of a broken system.

The problem with eviction records

Across states, applicants with an eviction record are 84% more likely to have their housing application denied than applicants without an eviction history. The well-documented racial bias in evictions leaves already-vulnerable renters at risk of unfairly being denied housing.

It is no surprise, then, that our California survey found white respondents were more than twice as likely as Black and Latinx respondents to have their applications accepted.

Smaller landlords use more AI

The Georgia and North Carolina results also confirmed our California finding that landlords with portfolios of 1-4 units were 5.5% more likely than landlords with larger portfolios to accept a screening recommendation without additional review.

This is a huge issue as smaller landlords are commonly exempted from regulations in all three states, leaving applicants in a dangerous position: they have fewer protections than other renters and are more at the mercy of screening algorithms. 

What now?

After being rejected from an affordable housing opportunity, Mary Louis found another apartment—one whose management didn’t use algorithmic screening tech. They could see a more three-dimensional image of her as a prospective tenant. The catch? The apartment was more expensive, a bigger burden for someone already struggling to make ends meet.

She later found out that she wasn’t alone in this experience. She joined a lawsuit against the company that made the algorithmic screening tool—along with more than 400 Black and Hispanic tenants who used housing vouchers and were denied housing. And they won.

We can’t rely on lawsuits alone, though, to address this issue. After conducting surveys in two additional states and getting results similar to those from our California survey, we’re further convinced that we need to:

  1. Close information asymmetries between tenants, landlords, and screening companies. 
  2. Shift the burden of enforcing renter rights from individual renters to the landlords, screening companies, and regulators with the institutional power to do so effectively. 

Those recommendations drew in part from HUD’s 2024 guidance, which clarifies the responsibilities under the Fair Housing Act of housing providers that use algorithmic tenant screening and of the screening service companies themselves. With enforcement resources being scaled back at the federal level, we must enact and enforce protections at the state and local levels.

The Automated Decisions Safety Act (AB 1018) in California is a key example of the policies we need to combat the negative impacts of tech, such as tenant screening algorithms, on vulnerable communities.

Check out our full 2025 legislative agenda to learn more about what we’re doing to create a more just housing system—and a more just tech-powered economy overall.