How AI-powered tenant screening hurts renters

Automated tenant screening is already widespread, disproportionately impacting low-income and other vulnerable renters. We need to stop housing AI from deepening racial and economic inequities in the housing market.

Key takeaways

We surveyed renters and landlords to understand how automated tenant screening tech is used by landlords to screen renters in California—and how these opaque algorithms leave both renters and landlords in the dark. Here are our key takeaways.

AI-enabled tenant screening systems are widely used in the rental market

  • Almost two-thirds of the landlords we surveyed received tenant screening reports that contained some AI-generated score or recommendation.
  • Our survey indicates that landlords often rely heavily on the score alone when making housing decisions.

The use of Minority Report-esque predictive scoring for renters is prevalent

  • One in five landlords reported receiving predictive information from screening companies.
  • The EU’s GDPR restricts decisions based solely on automated profiling; why are we letting algorithms decide how we’ll behave in the future?

Renters are often left in the dark, deepening power imbalances that threaten housing rights

  • Only 3% of renters knew the name of the screening or consumer reporting agency; 76% mistakenly said their landlord or property management company was the one screening them.
  • This raises the question: how can renters enforce their rights if they don’t know who is screening them and how they’re doing it?

AI tenant screening systems disproportionately impact the most vulnerable renters

  • We found that smaller landlords and landlords who serve lower-income renters used AI-enabled tenant screening the most.
  • Some of our most vulnerable renters are essentially being used as test subjects for AI screening. This raises questions about potential bias, exploitation, and further financial marginalization.

This survey backs up what we already know: the current housing market is steeped in racial bias

  • Black and Latinx renter survey respondents were about half as likely as white respondents to have their rental applications accepted (46% and 43% as likely, respectively).
  • This, combined with the fact that lower-income renters are disproportionately people of color, suggests that automated tenant screening has the potential to deepen racial bias in housing.

Download the paper below. Want to learn more about how tech is impacting renters? Check out our housing page for more resources.

Screened out of housing

Download the full paper here
