To the Federal Trade Commission and Consumer Financial Protection Bureau: What you need to know about algorithmic tenant screening
Ms. Jones* was seeking to rent an apartment for herself and her three children using her housing choice voucher. On paper, she was an attractive tenant for many landlords: she had a credit score of 629, was employed, and had no criminal history or adverse rental history. Nonetheless, six different apartment complexes denied Ms. Jones housing, and each time she paid a nonrefundable application fee only to end up without an apartment.
Ms. Jones’s final denial letter stated that she was rejected because of a tenant screening report from SafeRent Solutions and provided SafeRent’s contact information. Unfortunately, she was unable to request her report or contest any inaccuracies. To request her report, Ms. Jones would have had to complete SafeRent’s burdensome request form, which asks for every address where she has resided in the past seven years, a copy of her state ID, and a recent tax or utility bill. Ms. Jones was unable to use her voucher within the 120 days she was allotted, even after receiving two extensions. She lost her voucher and is still searching for housing.
Ms. Jones’s experience is far from unique. The housing industry has a long history of pervasive discrimination that disproportionately impacts marginalized groups. What’s more, emerging technology like algorithmic tenant screening has the potential to encode that discrimination for years to come.
Earlier this year, the Federal Trade Commission and Consumer Financial Protection Bureau called for experts to share their knowledge on the tenant screening industry. We partnered with HOME of Virginia to illuminate the role that algorithms are increasingly playing in the space—and how they compound discrimination against our most vulnerable renters.
*Ms. Jones is a client of HOME of Virginia. Her name has been changed to preserve her anonymity.
Highlights from Our Comment
We combined HOME of Virginia’s first-hand experience counseling tenants with key takeaways from our Tech, Bias, and Housing Initiative research. Our comment covers:
- How algorithms are currently being used in the tenant screening process.
- How algorithmically derived tenant screening reports and scores mask data integrity issues, prevent landlords from reviewing an applicant’s underlying records, and provide little explanation to denied applicants.
- What regulators can do to address these harms, improve transparency, and provide recourse for affected tenants.
You can read the full comment here:
Share Your Story
Do you have rental housing application experiences that you believe were impacted by technology? We want to hear from you!