Hiring software Workday's AI may have an 'ageist' problem, lawsuit claims; company says 'they are not trained to…'

Hiring software provider Workday is facing a class action lawsuit alleging that its AI-powered job applicant screening system has an "ageist" problem. The lawsuit claims that Workday's system discriminates against candidates aged 40 and over. It builds on an employment discrimination complaint filed last year by Derek Mobley, whose initial suit alleged that the company's algorithm-based system discriminated against applicants based on race, age, and disability. According to a report by Forbes, four more plaintiffs have now joined the lawsuit, specifically accusing Workday of age discrimination.


What Workday said about the lawsuit

In an email to Forbes, a Workday spokesperson denied allegations that the company's technology contains bias. The company said, “This lawsuit is without merit. Workday’s AI recruiting tools do not make hiring decisions, and our customers maintain full control and human oversight of their hiring process. Our AI capabilities look only at the qualifications listed in a candidate’s job application and compare them with the qualifications the employer has identified as needed for the job. They are not trained to use—or even identify—protected characteristics like race, age, or disability. The court has already dismissed all claims of intentional discrimination, and there’s no evidence that the technology results in harm to protected groups.”

The Workday spokesperson also noted that the company has recently implemented measures to ensure its software adheres to ethical standards.


Hidden bias in AI hiring tools and why automation may not always be fair

According to data compiled by DemandSage, an estimated 87% of companies are using AI for recruitment in 2025. The report notes that these companies rely on tools such as Workable, BambooHR, and Rippling.

While these systems help automate hiring, a University of Washington study from last year found that they are often biased. AI tools can exhibit racial, gender, and socioeconomic biases, inherited from the data they are trained on or from the algorithms themselves.

Notable examples include Amazon's scrapped AI recruiting tool, which discriminated against women, and resume filters that favour elite education or specific language patterns, often excluding underrepresented groups.