What you need to know about automated discrimination

And why we need to pass AB 1018, the Automated Decisions Safety Act, in California.
March 26, 2025

Imagine, for a moment, that you’re a single parent on the outskirts of Pittsburgh, Pennsylvania. You’ve just come home from your second job when you hear a knock at the door. Child Protective Services is on your doorstep, checking in on you and your three kids for evidence of neglect. No concerned neighbors called in; your kids are fed and well-loved. So why does a social worker suspect you of neglect?

One of your kids is disabled, and you use food stamps to make ends meet. Because of that and other factors, an algorithm has decided that you and your family are at high risk for child abuse.

This isn’t just hypothetical. In Allegheny County, PA, the Allegheny Family Screening Tool gives social workers recommendations about which families should be investigated for potential neglect. The tool uses data like race, zip code, disability, and use of public welfare benefits to flag potentially neglectful parents, and it disproportionately recommends low-income parents for investigation. What’s more, the tool was found to be biased along racial lines and against families with disabled parents or children.

Life decisions, brought to you via algorithm

Algorithms are embedded in every facet of daily life, making decisions about us all within opaque software “solutions.”

Employers use AI-backed hiring tools with biases against women and Black applicants. Landlords use algorithmic tools that screen out people who use housing vouchers, i.e. low-income people. People with disabilities are seeing their healthcare cut because of bias in decision-making models.

Automated Decision Systems (also known as ADS) make more and more of these life-changing decisions every day with little transparency or regulatory oversight. Left unchecked, these algorithms can produce biased results, making everyday people vulnerable to automated discrimination without them even knowing about it.

We need to bring protections for Californians up to speed with today’s technologies and how they impact our lives. The Automated Decisions Safety Act (AB 1018) will do just that.

But what is an Automated Decision System?

Automated Decision Systems explained

Broadly, Automated Decision Systems are tools that take data and put it through an algorithm. Then, based on that data—or information that the algorithm “inferred” from the data—the system generates a recommendation, a score, or a decision.

This includes data gathered from sources the public doesn’t know about, inaccurate data, and data people never consented to having collected.

These systems typically use some form of artificial intelligence. The algorithms in ADS tools are trained on historical data, “learning” what to look for in datasets and how to weigh different data points when producing an output.
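
To make the mechanics concrete, here’s a minimal, purely hypothetical sketch in Python of what the scoring step of such a system can look like. Every feature name, weight, and threshold below is invented for illustration; real systems are far more complex, but the basic pattern is the same: data goes in, a formula learned from past data weighs it, and a score or recommendation comes out.

```python
# A hypothetical, oversimplified sketch of an ADS scoring step.
# Every feature name, weight, and threshold below is invented for
# illustration; it does not describe any real tool.

# Weights "learned" from historical data during training.
LEARNED_WEIGHTS = {
    "receives_public_benefits": 0.9,  # stands in for income
    "zip_code_risk_index": 1.4,       # stands in for neighborhood, and often race
    "prior_agency_contact": 2.0,
}

RISK_THRESHOLD = 2.5  # scores at or above this get flagged for investigation


def score_case(case: dict) -> float:
    """Combine a case's data points into a single risk score."""
    return sum(weight * float(case.get(feature, 0))
               for feature, weight in LEARNED_WEIGHTS.items())


def recommend(case: dict) -> str:
    """Turn the score into the recommendation a caseworker sees."""
    return "investigate" if score_case(case) >= RISK_THRESHOLD else "no action"


# A family using public benefits in a "high-risk" zip code sits just below
# the threshold; one prior call to the agency, founded or not, tips them over.
family = {"receives_public_benefits": 1, "zip_code_risk_index": 1, "prior_agency_contact": 0}
print(score_case(family), recommend(family))  # 2.3 no action

family["prior_agency_contact"] = 1
print(score_case(family), recommend(family))  # 4.3 investigate
```

Notice that none of the made-up inputs above measure neglect directly; they’re proxies for it.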

The problem is that every step of this process introduces potential bias—whether or not people investigate the ADS’s output. Because the whole purpose of these systems is to automate processes, it’s rare that people interrogate the decision before harm is already done.

Even when harm is done (someone is denied bail, or has their kids taken away), it’s hard to know for certain that ADS bias was the cause, let alone take action, because there’s no transparency into how the decision was made.

Automated Decision Systems are already impacting people

In California, the city of Los Angeles uses an ADS called VI‑SPDAT to determine which unhoused people have the highest need for housing—but investigators have discovered that this tool is also rife with racial bias. The tool favored white applicants for affordable housing, scoring 67% of unhoused white young adults into the highest priority group, compared with 56% of Latino young adults and 46% of Black young adults. 

And that’s just in our government; there are thousands of companies using automated decision systems to determine who will be a trustworthy renter, a suitable candidate for a job, and more.

What can we do about this?

Systems that shape people’s futures must follow clear rules, undergo rigorous testing, and be held accountable, so that they avoid harm and ensure fairness. Too often, people flick on autopilot and walk away from the wheel, leaving these tools to steer decisions without oversight, regardless of the risk they pose to the people in their paths.

The worst part is that Californians are left in the dark about how algorithms make these life-changing decisions. We have a right to know how tech is being used to evaluate us—and we have a right to speak out if we believe those decisions are wrong.

If tech companies are required to follow regulatory blueprints that prevent discrimination and to explain clearly to the public how they use these tools, Californians can trust that these life-changing decisions aren’t being made lightly.

Together, we can construct a future where fairness and trust are the foundation of every system. The Automated Decisions Safety Act will be the blueprint for that future, and we’re working with SEIU and Assemblymember Bauer-Kahan to make that blueprint a reality.

How you can help

Have you been on the receiving end of an AI-driven decision that you believe was biased or discriminatory? We want to hear your story. Fill out the form below, and one of our organizers will follow up with you shortly. We’re collecting these stories to share with legislators, the press, and the public; if you’d prefer to remain anonymous, we’ll protect your privacy. If you have questions about your privacy, reach out to info@techequity.us.

Share your story
