How AI Hiring Bias May Be Costing Californians Their Dream Jobs


Technology is supposed to level the playing field, but when it comes to hiring, the opposite might be happening. In California, more and more employers are using artificial intelligence and automated decision systems to screen, evaluate, and select job candidates.

But these systems aren’t immune to the same biases that affect human hiring managers, and in some cases, they may make things worse.

If you’ve ever wondered why you never got a call back despite strong qualifications, or why you never even seemed to be seen in the applicant pool, it’s possible that AI played a role. You deserve to understand how, why, and what you can do about it.

What is AI Hiring Bias?

AI hiring bias happens when automated decision systems (the algorithms and artificial intelligence tools that screen job applications) discriminate against candidates based on protected characteristics.

These tools are everywhere now. By some industry estimates, over 70% of companies use AI in their hiring process to screen resumes, rank candidates, conduct video interviews, and decide who moves forward.

The promise was simple: AI would make hiring faster, cheaper, and more objective. The reality is different.

Research from the University of Washington found that three major AI language models favored white-associated names over Black-associated names 85% of the time, and favored female-associated names over male-associated names only 11% of the time. The models never favored Black male-associated names over white male-associated names, not once.

Real AI Hiring Bias Examples

AI discrimination isn’t theoretical. It’s happening right now.

The Workday Lawsuit

In 2024, Derek Mobley, a Black man over 40 with a disability, filed a class-action lawsuit against Workday, Inc., one of the largest HR software companies in the world.

Mobley claimed Workday’s AI screening software systematically discriminated against job applicants based on race, age, and disability.

In May 2025, a federal judge certified the case as a collective action, allowing it to proceed on behalf of potentially hundreds of thousands of applicants over 40 who were rejected by Workday’s system.

The court ruled that Workday could be held liable for discrimination even though it didn’t employ the applicants, because its AI participated in the hiring decision.

The iTutorGroup Settlement

In 2023, the EEOC settled its first-ever AI discrimination lawsuit. The online tutoring company iTutorGroup programmed its recruitment software to automatically reject female applicants aged 55+ and male applicants aged 60+.

Over 200 qualified individuals were disqualified based solely on age.

The company paid $365,000 to settle the case.

Amazon’s Scrapped Hiring Tool

In 2018, Amazon discovered its AI hiring tool discriminated against women applying for technical positions.

The AI had been trained on resumes from mostly male employees. It learned to prefer words like “executed” and “captured,” terms that appeared more often in men’s resumes.

Amazon scrapped the tool. But countless companies are still using similar systems.

HireVue’s Video Analysis

In March 2025, the ACLU filed a discrimination complaint against HireVue and Intuit on behalf of an Indigenous deaf job applicant.

The AI video interview tool evaluated her speech patterns, facial expressions, and “active listening.” She was rejected and told she needed to “practice active listening.”

The complaint alleges the AI was inaccessible to deaf applicants and likely performed worse when evaluating non-white applicants who speak dialects like Native American English.

How AI Discrimination in Hiring Actually Works

AI hiring bias happens in several ways.

Biased Training Data

AI learns from historical data. If that data reflects past discrimination, the AI will perpetuate it.

If a company historically hired mostly men for leadership roles, the AI will learn that men are “better fits” for leadership. If past hires were mostly white, the AI will favor white candidates.

The algorithm doesn’t understand context. It just sees patterns and replicates them.
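
To see how pattern replication works, here is a minimal, hypothetical sketch in Python (not any vendor’s actual system): a toy screener “trained” on historical hires simply learns which resume words co-occurred with past hires and scores new applicants by those words. The resumes and word lists are invented for illustration.

```python
from collections import Counter

# Hypothetical historical data: past hires at a company that mostly
# hired men. The "model" never sees gender directly.
past_hires = [
    "executed project captured market led team",
    "executed strategy captured accounts",
    "executed roadmap led engineering",
]
past_rejections = [
    "collaborated with team supported launch",
    "coordinated volunteers supported outreach",
]

def train(hired, rejected):
    """Score each word by how much more often it appears in hired resumes."""
    hired_counts = Counter(w for r in hired for w in r.split())
    rejected_counts = Counter(w for r in rejected for w in r.split())
    vocab = set(hired_counts) | set(rejected_counts)
    return {w: hired_counts[w] - rejected_counts[w] for w in vocab}

def score(weights, resume):
    """Sum the learned word weights for one applicant's resume."""
    return sum(weights.get(w, 0) for w in resume.split())

weights = train(past_hires, past_rejections)

# Two equally qualified applicants who describe the same work differently:
applicant_a = "executed product launch captured new market"
applicant_b = "collaborated on product launch supported new market"

print(score(weights, applicant_a) > score(weights, applicant_b))  # True:
# the toy model replicates the historical wording pattern, not job performance.
```

The point of the sketch: nothing in the code mentions a protected characteristic, yet the screener still favors whoever writes like the people it was trained on.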

Coded Discrimination

Some AI hiring tools analyze things that correlate with protected characteristics:

  • Speech patterns and accents (national origin discrimination)
  • Facial expressions and tone of voice (race and disability discrimination)
  • Typing speed or mouse movements (disability discrimination)
  • Zip codes in addresses (race and socioeconomic discrimination)
  • College names (socioeconomic and potentially racial discrimination)

None of these factors directly measure job performance. But they can screen out protected groups.
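
The proxy problem can be illustrated with a hypothetical sketch: even when a screener never sees a protected characteristic, a correlated field like zip code can carry the same signal. The zip codes and records below are invented for illustration.

```python
# Hypothetical applicant records: the screener never sees race or age,
# but zip code correlates with a protected group in this toy data.
applicants = [
    {"zip": "94601", "qualified": True,  "hired_historically": False},
    {"zip": "94601", "qualified": True,  "hired_historically": False},
    {"zip": "94610", "qualified": True,  "hired_historically": True},
    {"zip": "94610", "qualified": False, "hired_historically": True},
]

def zip_hire_rate(records):
    """Historical hire rate per zip code -- the 'pattern' a model learns."""
    rates = {}
    for zip_code in {r["zip"] for r in records}:
        group = [r for r in records if r["zip"] == zip_code]
        rates[zip_code] = sum(r["hired_historically"] for r in group) / len(group)
    return rates

rates = zip_hire_rate(applicants)
# A screener that scores by learned zip-code rates would rank an
# unqualified applicant from 94610 above a qualified one from 94601:
print(rates["94610"] > rates["94601"])  # True
```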

The “Ideal Candidate” Problem

AI systems trained on current employees will favor candidates who look like current employees.

If your company lacks diversity, your AI will perpetuate that lack of diversity.

This keeps out people with non-traditional backgrounds, unique perspectives, and valuable skills that don’t match the historical pattern.

California Law Prohibits AI-Driven Discrimination

California doesn’t take AI discrimination lightly.

California Fair Employment and Housing Act (FEHA)

Under FEHA (Government Code Section 12940), employers with five or more employees cannot discriminate based on:

  • Race, color, ancestry, or national origin
  • Religion or creed
  • Age (40 and over)
  • Physical or mental disability, or medical condition
  • Genetic information
  • Sex, gender, gender identity, or gender expression
  • Sexual orientation
  • Marital status
  • Military or veteran status

This applies whether the discrimination comes from a human or an algorithm.

California’s New AI Regulations

On October 1, 2025, California implemented groundbreaking regulations specifically targeting AI discrimination in employment.

These regulations make it clear: employers are liable for discrimination caused by their AI tools, even if they didn’t intend to discriminate.

The rules require:

  • Anti-Bias Testing: Employers must test their AI systems for discriminatory impacts
  • Record Retention: Companies must keep AI-related data for four years
  • Disparate Impact Liability: Even unintentional discrimination violates the law if the AI has a disparate impact on protected groups
  • Employer Responsibility: Employers can’t blame third-party vendors; they’re responsible for any AI discrimination

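
Anti-bias testing often starts with a disparate-impact check such as the “four-fifths rule” from the EEOC’s Uniform Guidelines: if a protected group’s selection rate is less than 80% of the highest group’s rate, the tool is presumed to have an adverse impact. Here is a minimal sketch of that calculation; the group labels and counts are hypothetical.

```python
def four_fifths_check(groups):
    """groups: {name: (selected, applicants)}.
    Compares each group's selection rate to the highest group's rate
    and flags a ratio below 0.8 as potential adverse impact."""
    rates = {g: selected / applicants for g, (selected, applicants) in groups.items()}
    top = max(rates.values())
    return {g: (rate / top, rate / top >= 0.8) for g, rate in rates.items()}

# Hypothetical screening outcomes from an AI resume filter:
outcomes = {
    "group_a": (60, 100),  # 60% selected
    "group_b": (30, 100),  # 30% selected
}
result = four_fifths_check(outcomes)
print(result["group_b"])  # (0.5, False): well below the 0.8 threshold,
# evidence of disparate impact that an employer would need to investigate.
```

A failing ratio is not automatically a legal violation, but under California’s regulations an employer that never runs a check like this has a much harder time defending the tool.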
Federal Protections Still Apply

Title VII of the Civil Rights Act of 1964 prohibits employment discrimination based on race, color, religion, sex, and national origin.

The Age Discrimination in Employment Act (ADEA) protects workers 40 and older.

The Americans with Disabilities Act (ADA) prohibits disability discrimination.

These federal protections apply to AI hiring tools just as much as human decision-makers.

Signs You May Be a Victim of AI Hiring Bias

How do you know if AI discrimination is blocking your job search?

Pattern of Rapid Rejections

You apply to dozens or hundreds of jobs and get rejected within minutes or hours, before a human could have reviewed your application.

You’re Overqualified But Rejected

Your qualifications exceed the job requirements, but you’re consistently rejected at the screening stage.

Rejection After Video Interviews

You complete AI video interviews and get rejected despite having relevant experience and strong credentials.

Age-Related Timing

Your rejection rate suddenly increased after your age became visible on your resume (perhaps through graduation dates or years of experience).

Inconsistent Feedback

When you do get feedback, it’s vague, contradictory, or doesn’t match your qualifications.

What You Can Do About AI Discrimination

If you believe AI bias is costing you job opportunities, you have options.

Document Everything

Keep records of:

  • Every job application and when you applied
  • Rejection emails and timing
  • Job descriptions and your qualifications
  • Any correspondence with employers
  • Screenshots of your applications

This documentation can establish a pattern of discrimination.

Request Information

Some states and localities now require employers to disclose when automated tools are used in hiring. Even where disclosure isn’t required, you can ask.

Ask employers:

  • Did you use automated screening tools?
  • What criteria did the AI use?
  • Were accommodations available for the screening process?

File a Complaint

You can file a discrimination complaint with:

  • California Civil Rights Department (CRD): You have three years to file for violations of California law
  • EEOC: You typically have 300 days to file for federal violations

Talk to an Employment Lawyer

AI discrimination cases are complex. They involve technical evidence, statistical analysis, and novel legal theories.

An experienced employment attorney can:

  • Evaluate whether you have a strong case
  • Gather evidence and expert testimony
  • Navigate the legal process
  • Negotiate with employers
  • Represent you in litigation if necessary

Why Employers Can’t Blame the Algorithm

Here’s what employers often say: “We didn’t discriminate. The AI made the decision.”

California law rejects this excuse.

Under the new regulations, employers are responsible for discrimination caused by their AI tools.

They can’t outsource their legal obligations to a software vendor.

If an employer uses an AI hiring tool that discriminates, the employer is liable.

The Future of AI Hiring Bias Lawsuits

The Mobley v. Workday case is just the beginning.

Legal experts predict a wave of AI discrimination lawsuits as:

  • More workers realize AI is screening them out
  • State and local AI regulations continue expanding
  • Courts establish precedent on algorithmic discrimination
  • Technology makes it easier to detect patterns of bias

California is leading the charge with the strongest protections in the nation.

You Don’t Have to Accept Discrimination

If you’re qualified but consistently rejected, AI discrimination might be the reason.

You didn’t do anything wrong.

The algorithm failed. The employer failed to ensure their hiring process was fair.

California law is on your side.

Contact TONG LAW

If you believe you’ve been discriminated against by an AI hiring system, TONG LAW can help.

We represent employees exclusively, never employers. We understand California’s employment discrimination laws and the emerging legal landscape around AI and discrimination.

We serve clients throughout California, including Oakland, San Francisco, Sacramento, and the entire Bay Area.

Don’t let biased algorithms cost you the career you’ve worked for. Contact TONG LAW today for a confidential consultation about your AI discrimination case.

Author Bio

Vincent Tong

Vincent Tong is the CEO and Managing Partner of TONG LAW, a business and employment law firm located in Oakland, CA. Vincent is a fierce advocate for employees facing discrimination and wrongful termination. With several successful jury trial victories and favorable settlements, he has earned a strong reputation for delivering exceptional results for his clients.

In addition, Vincent provides invaluable counsel to businesses, guiding them on critical matters such as formation and governance, regulatory compliance, and protection of intellectual property assets. His depth of experience allows him to anticipate risks, devise strategies to avoid legal pitfalls, and empower clients to pursue their goals confidently.

Vincent served as the 2021 President of the Board of Directors for the Alameda County Bar Association and sits on the Executive Board for the California Employment Lawyers Association. Recognized for outstanding skills and client dedication, he has consecutively earned the Super Lawyers’ Rising Star honor since 2015, reserved for the top 2.5% of attorneys. He also received the Distinguished Service Award for New Attorney from the Alameda County Bar Association in 2016. He is licensed to practice before all California state courts and the United States District Court for the Northern and Central Districts of California.

LinkedIn | State Bar Association | Super Lawyers | Google