New ADS Rules Under FEHA: What San Francisco, Oakland, and other California Employees Must Know by October 1

Starting October 1, 2025, new regulations under California’s Fair Employment and Housing Act (FEHA) govern the use of Automated Decision Systems (ADS) in employment decisions. These rules affect employees throughout California, including San Francisco and Oakland, by restricting how employers may use artificial intelligence and algorithmic tools in hiring, promotions, evaluations, and terminations.

If you work in San Francisco, Oakland, or anywhere in California, these regulations provide crucial new protections against algorithmic discrimination. The rules apply to virtually all employment-related AI tools, from resume screeners to performance evaluation software, and give you specific rights when your employer uses these technologies.

What Are Automated Decision Systems?

The new regulations define an Automated Decision System (ADS) as a “computational process that makes a decision or facilitates human decision making regarding an employment benefit.” This broad definition includes:

  • AI-powered resume scanning tools
  • Online assessments and “game-based” evaluations
  • Video interview analysis software
  • Performance monitoring systems
  • Automated ranking or scoring of employees
  • Targeted job advertisement tools
  • Facial expression and voice pattern analysis

These technologies can exist in many workplace systems, often without employees being fully aware they’re being evaluated by algorithms rather than humans.

Key FEHA Protections Against ADS Discrimination

Under the updated FEHA regulations, San Francisco, Oakland, and California employers are prohibited from:

  • Using ADS that discriminate against applicants or employees based on protected characteristics (race, gender, age, disability, etc.)
  • Implementing ADS that use “proxies” – seemingly neutral factors that correlate with protected characteristics (like zip codes serving as proxies for race)
  • Using automated screening tools that filter applicants based on schedule availability without accommodations for religious observances or disabilities
  • Deploying AI systems that analyze physical characteristics (facial features, voice patterns) in ways that could disadvantage protected groups

These prohibitions apply even if the discrimination is unintentional. Employers are responsible for ensuring their automated tools don’t produce discriminatory outcomes.

Expanded Legal Liability for Employers and Vendors

A significant change in the regulations is the expanded definition of an employer’s “agent.” Under the new rules:

  • Third-party vendors who provide ADS tools can be considered “agents” of your employer
  • Both your employer and the technology vendor can be held liable for discriminatory outcomes
  • Your employer remains responsible even when using outsourced AI hiring or evaluation tools

This means San Francisco, Oakland, and California employees can seek remedies from both their direct employer and the companies providing the discriminatory technology.

How the New Rules Protect Oakland Workers

For job applicants, the rules:

  • Prevent automated systems from screening you out based on schedule availability without accommodation options
  • Restrict tools that might disadvantage you based on disability, age, or other protected characteristics
  • Prohibit facial analysis technologies that could discriminate based on race, gender, or national origin

For current employees, the regulations restrict discriminatory use of automated tools in performance evaluations, promotions, discipline, and terminations.

These protections apply to all California employees, whether in San Francisco or Oakland, regardless of industry or position.

Your New Rights When Facing Algorithmic Bias

If you believe an employer’s ADS has discriminated against you, you have the right to:

  • File a complaint with the California Civil Rights Department (CRD) against both your employer and any third-party vendor providing the discriminatory ADS
  • Request accommodations if an ADS evaluates factors impacted by disability, religion, or other protected characteristics
  • Contest decisions made by automated systems, particularly if they appear to disadvantage protected groups
  • Access information about how the employer tests their systems for bias (in litigation)

These rights provide real mechanisms to challenge unfair algorithmic decision-making.

Employer Requirements and Potential Defenses

Under the regulations, employers using ADS must:

  • Retain records of all ADS data for at least four years, including data used to develop or customize the system
  • Conduct anti-bias testing to identify and mitigate potential discriminatory impacts
  • Provide accommodations for applicants and employees who may be disadvantaged by ADS due to disability, religion, or other protected characteristics
  • Allow human review of significant employment decisions rather than relying solely on automated processes

Importantly, the regulations create an affirmative defense for employers who can demonstrate they conducted thorough anti-bias testing and took steps to mitigate discriminatory impacts. Conversely, employers who fail to test their systems face increased legal vulnerability.

How to Recognize and Respond to Potential ADS Discrimination

You may be experiencing ADS discrimination if:

  • You’re repeatedly rejected from jobs despite meeting qualifications, particularly through automated systems
  • Performance evaluations suddenly decline after implementation of new software systems
  • Disciplinary actions seem to target employees with similar protected characteristics
  • Accommodations are denied because of “system requirements” or “algorithm results”
  • Promotion or compensation decisions follow patterns that disadvantage protected groups

If you notice these signs, document your experience thoroughly. Save emails, screenshots, performance reviews, and communications about the automated systems involved.

Steps to Take If You Suspect ADS Discrimination

If you believe you’ve experienced discrimination through an employer’s use of ADS:

  • Request information about the automated system used in your situation
  • Document everything related to the potentially discriminatory outcome
  • Report concerns to your employer’s HR department or management
  • File a complaint with the California Civil Rights Department (CRD) at https://calcivilrights.ca.gov/complaintprocess/
  • Consult an attorney specializing in employment discrimination

Remember that under California Labor Code Section 98.6, your employer cannot legally retaliate against you for asserting your rights under these regulations.

Frequently Asked Questions

Do these regulations apply to small businesses in Oakland?

Yes. The FEHA regulations apply to all California employers with five or more employees, regardless of industry. If your Oakland employer uses any form of automated decision-making in employment, these rules apply.

Can my employer still use AI tools for hiring or evaluations?

Yes, employers can still use these tools, but they must ensure the systems don’t discriminate against protected groups, conduct anti-bias testing, maintain records, and provide accommodations when needed.

What types of algorithms are covered under these regulations?

The regulations cover virtually all computational processes used in employment decisions, including AI, machine learning, statistical models, and other data processing techniques that impact hiring, evaluations, promotions, or terminations.

If my employer uses a third-party AI platform for hiring, who’s responsible for discrimination?

Both your employer and the technology provider can be held liable under the expanded definition of “agent” in the regulations. This prevents employers from deflecting responsibility to vendors.

Protecting Your Rights Under the New Regulations

As AI becomes more prevalent in workplace decisions, these new FEHA regulations offer crucial protections for Oakland employees against algorithmic discrimination.

If you’re concerned about potential discrimination through automated systems at your workplace, TONG LAW can help. Our experienced employment attorneys specialize in helping Oakland workers facing all forms of discrimination, including through algorithmic systems. Contact us if you believe your rights have been violated through an employer’s use of automated decision systems.

This blog post provides general information for educational purposes only and should not be construed as legal advice. Every employment situation is unique. Please contact an attorney for advice specific to your circumstances.

Author Bio

Vincent Tong

Vincent Tong is the CEO and Managing Partner of TONG LAW, a business and employment law firm located in Oakland, CA. Vincent is a fierce advocate for employees facing discrimination and wrongful termination. With several successful jury trial victories and favorable settlements, he has earned a strong reputation for delivering exceptional results for his clients.

In addition, Vincent provides invaluable counsel to businesses, guiding them on critical matters such as formation and governance, regulatory compliance, and protection of intellectual property assets. His depth of experience allows him to anticipate risks, devise strategies to avoid legal pitfalls, and empower clients to pursue their goals confidently.

Vincent served as the 2021 President of the Board of Directors for the Alameda County Bar Association and sits on the Executive Board of the California Employment Lawyers Association. Recognized for outstanding skill and client dedication, he has earned the Super Lawyers Rising Star honor every year since 2015, a distinction reserved for the top 2.5% of attorneys. He also received the Distinguished Service Award for New Attorney from the Alameda County Bar Association in 2016. He is licensed to practice before all California state courts and the United States District Court for the Northern and Central Districts of California.
