Unpacking HUD’s new guidance on algorithmic tenant screening
On May 2nd, the Department of Housing and Urban Development's (HUD) Office of Fair Housing and Equal Opportunity issued "Guidance on the Application of the Fair Housing Act to the Screening of Applicants for Rental Housing." The guidance explains how the Fair Housing Act protects renters against discrimination in tenant screening, including automated discrimination through algorithms. It also illustrates how landlords and companies offering screening services can employ this technology in a non-discriminatory manner.
Hannah Holloway, Senior Director of Housing Programs at TechEquity, said:
“We are encouraged by HUD’s guidance and leadership on tenant screening practices. The increasing use of AI and technology in the rental screening process is a big concern at TechEquity, and this latest guidance brings attention to the need for greater transparency, validation, disclosure, and enforcement against third-party screening companies.
We commend HUD’s efforts to clarify the responsibilities of housing providers and screening companies to ensure that the use of this technology does not lead to discriminatory housing decisions that violate the Fair Housing Act.”
Traditional vs. Algorithmic Tenant Screening
The tenant screening industry is vast and profitable, encompassing as many as 2,000 companies and $1.3 billion in annual revenue. In 2022, TechEquity surveyed the rental screening industry landscape and found that many of the specifics of how tech-enabled screening companies operate are hidden behind proprietary products and claims of trade secrets.
Most renters are familiar with applying for private rental housing: you submit an application with basic personal information, proof of income, references, and consent for the landlord to pull various personal reports. The reports can include credit information, criminal arrest and conviction history, and previous eviction cases. In the traditional screening method, a landlord would then independently review the information returned in those reports and determine whether to approve or deny the housing application.
Housing AI has disrupted the traditional system such that landlords receive not just the unprocessed reports but also the third-party company’s own analysis of the reports. Some combine an individual’s background reports with larger datasets to predict the individual’s likelihood of being a stable tenant in the future. Others merely assess the background reports to create their own proprietary analysis of a tenant’s application. There are different end-products, but tech-enabled screening outsources the assessment of various reports to companies who offer simplified risk scores or rental recommendations that flatten tenant profiles into metrics or instructional graphics.
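To make the "flattening" concrete, here is a minimal sketch of how a screening product might collapse several background reports into a single risk score and recommendation. Everything here is invented for illustration: the weights, thresholds, and field names are hypothetical, since real vendors keep their models proprietary. The sketch shows how one eviction filing, even a dismissed one, can swing the final verdict.

```python
from dataclasses import dataclass

@dataclass
class ApplicantReports:
    """Raw background reports a screening company might receive."""
    credit_score: int        # e.g. 300-850
    eviction_filings: int    # prior eviction cases, regardless of outcome
    conviction_records: int  # criminal conviction history

def risk_score(reports: ApplicantReports) -> int:
    """Collapse several reports into one 0-100 'risk' number.

    These weights are made up for illustration; a real vendor's
    weighting is hidden behind trade-secret claims.
    """
    score = 100
    score -= max(0, 700 - reports.credit_score) // 10
    score -= 20 * reports.eviction_filings
    score -= 10 * reports.conviction_records
    return max(0, score)

def recommendation(score: int) -> str:
    """Map the score to the simplified verdict a landlord sees."""
    if score >= 70:
        return "ACCEPT"
    if score >= 40:
        return "CONDITIONAL"
    return "DECLINE"
```

Under these invented weights, an applicant with a 550 credit score and a single eviction filing (which may have been dismissed) scores 65 and is flagged "CONDITIONAL" — the nuance of the underlying records never reaches the landlord, only the label.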
Why We Need to Regulate Algorithmic Tenant Screening
Tenants do not have clear information about how landlords and screening companies work together to make rental decisions, making it difficult to know whether their rights and protections are being upheld. Discriminatory screening outcomes, and the question of who is really making housing decisions (landlords or hidden companies), are concerns that tenants and advocates have raised for years.
The guidance focuses on the disparate impact that tenant screening practices have on tenants based on protected classes such as race and familial status. To counteract the role of tenant screening in housing discrimination, HUD calls on housing providers to:
- Choose only relevant screening criteria, publish their screening policies in advance, and use screening companies that assess applicants based only on the provider’s stated standards
- Apply discretion to the third-party screening results
- Give applicants the chance to contest a negative determination
- Ensure screening companies use accurate and non-discriminatory models
- Share with applicants all records obtained from screening companies, along with details on why an applicant was denied and which applicant characteristics fell short of approval
TechEquity’s forthcoming research into screening practices reveals that over one-third of housing providers rely on opaque, third-party screening analysis. At the same time, analysis of tenant responses shows significantly higher denial rates for Black and Latinx applicants, even after controlling for other factors such as income and rental amount.
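As a rough illustration of what "statistically significant" means here, the sketch below runs a standard two-proportion z-test on denial rates for two groups. The numbers are synthetic, invented purely for illustration, and a full analysis controlling for income and rental amount would use a regression model rather than this raw-rate comparison; the sketch only shows the basic significance check.

```python
import math

def two_proportion_z(denied_a: int, total_a: int,
                     denied_b: int, total_b: int) -> float:
    """z statistic for the difference between two denial rates."""
    p_a, p_b = denied_a / total_a, denied_b / total_b
    pooled = (denied_a + denied_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Synthetic numbers, not TechEquity's data:
z = two_proportion_z(denied_a=90, total_a=300,   # group A: 30% denied
                     denied_b=60, total_b=300)   # group B: 20% denied
significant = abs(z) > 1.96  # 5% two-sided threshold
```

With these made-up counts, z is about 2.83, above the conventional 1.96 cutoff, so a gap of this size in a sample this large would be unlikely to arise by chance.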
There is a clear need to put guardrails on rental screening technology to ensure a just housing system. While the guidance represents an important step forward in safeguarding housing rights, it isn’t currently more than a suggestion to housing providers. We must collectively push to ensure that the best practices in this document become requirements for both housing providers and the housing tech companies behind the algorithms.
As the reliance on this technology expands, we will continue to advocate for structural changes that reflect how technology impacts our livelihoods. This guidance serves as an actionable framework for creating and codifying safeguards around the role of tenant screening technology in housing decisions. We look forward to what this guidance means for enacting tenant protections that meet the needs of today’s housing system.
You can read the full guidance here.