The state of California data privacy and algorithmic decision-making

Our personal data is increasingly being used to make decisions about our lives—from housing to our workplaces—often without our knowledge. Californians sought to safeguard that data when they approved a ballot initiative enacting the California Privacy Rights Act (CPRA), which also created the California Privacy Protection Agency (CPPA).
The CPPA, though, wasn’t meant only to protect the collection of our data for the sake of privacy. It was also meant to address how our data is used in technology, such as automated decision-making tools (ADMTs), also known as automated decision systems (ADS).
After years of drafting, debate, and public comments, the CPPA has now voted unanimously to adopt regulations on ADMTs. Unfortunately, the approved regulations were weakened throughout the two-year process and do not provide the level of protection that many Californians may believe they voted for with the CPRA.
Throughout this process, TechEquity has weighed in consistently alongside labor and civil society groups—which represent hundreds of thousands of workers and consumers—to try to ensure that these regulations deliver real protections that address the current and future harms posed by ADMTs. The approved regulations fall short, but alongside our partners and communities, we are committed to continuing the necessary work to ensure that California is a leader in addressing economic equity issues resulting from the tech industry’s products and practices.
You can read/watch all of our statements related to these regulations here:
- June 2025 – Joint letter with the UC Berkeley Labor Center
- April 2025 – Joint letter to the CPPA
- January 2025 – Joint letter with the UC Berkeley Labor Center
- January 2025 – Comments to the CPPA
- November 2024 – Comments to the CPPA
- May 2024 – Comments to the CPPA
- March 2024 – Comments to the CPPA
- March 2024 – Joint letter with the UC Berkeley Labor Center
Below are the comments our Chief Program Officer, Samantha Gordon, provided at the CPPA Board Meeting on July 24, 2025.
Samantha Gordon’s Comment
Good afternoon. My name is Samantha Gordon with TechEquity.
We, along with a large number of labor and consumer advocacy groups, have consistently participated in the rule-making process in good faith. We have gathered evidence from workers and consumers; summarized academic research; and appeared several times before this body with painstaking analysis of each version of the draft regulations and their potential impacts. As a non-profit, this is not an inconsequential expenditure of time and resources for us. It reflects our belief that the work of the agency and its staff, and the impact of these regulations, is deeply important.
Despite all of this, we’ve watched as our recommendations were largely ignored and these regulations have grown weaker and weaker—protecting fewer and fewer Californians with every successive round of industry lobbying.
As a result, the proposed regulations before you today fail to deliver on the promise to California voters who created this Agency and instead leave us largely unprotected from the growing use of these systems and the potential harms that come with them.
We are deeply concerned about the extremely narrow definition of ADMTs. Under this definition, a company could decide that its system does not “substantially replace” human decisions and is not covered by these regulations.
For example, a national health insurer reportedly implemented an ADMT that used personal data to analyze insurance claims, with doctors expected to review each claim before making the final decision to approve or deny payment for patient care. Under this definition, we assume companies would not consider this use covered by these regs. However, an investigation found those doctors spent an average of only 1.2 seconds reviewing each health insurance claim. Under the current regs, businesses will point to such human involvement to exempt themselves from providing patients proper notice and explanation—even when their use of personal data creates a significant risk of an inaccurate decision for the consumer.
As the CPPA has acknowledged, this choice to weaken the definitions allows almost all companies to avoid accountability, exempting 90% of entities covered by this agency.
And what’s more: even in the slim number of cases in which these draft regulations would apply, affected individuals who have been denied a job, an apartment, or healthcare will likely not know about it because of the weak language around use notifications. As a result, impacted Californians will not be able to exercise their right to access data related to these outcomes that can have life-changing consequences for them.
We’re disappointed in where these regulations stand and urge you to reconsider the definitions, the use notice, and whose interests these regulations are supposed to serve—everyday Californians or Silicon Valley executives?
Thank you to the CPPA director, staff, and board for your important work.
What next?
While it’s disheartening that the CPPA ultimately passed the weakened ADS regulations, the CPPA isn’t our only avenue for regulating the use of ADMT/ADS in California.
Right now, we’re co-sponsoring the Automated Decisions Safety Act (AB 1018), which would ensure that ADS are vetted, that everyday people understand how these decisions impacting them are made, and that people know what to do if they suspect discrimination.
Have you been on the receiving end of an algorithm-driven decision that you believe was biased or discriminatory? We want to hear your story. Fill out the form below, and one of our organizers will follow up with you shortly.
Share your story about automated decisions