Workday hit with claims its AI hiring systems are discriminatory


Enterprise cloud application firm Workday is being sued over alleged racial and other biases in its AI applicant screening tool.

The class action complaint alleges that the tool contains algorithmic discrimination against applicants who are African American, over the age of 40, and those with disabilities.

The complaint also suggests that because Workday's AI is developed by humans, who often carry conscious or unconscious biases of their own, it is likely to be discriminatory as a result.

Plaintiff Derek Mobley, an African American male over the age of 40, who also suffers from depression and anxiety, filed the document in a California district court.

He stated that since 2018 he has been rejected from 80 to 100 job applications to companies that he believes utilise Workday’s screening tool in their hiring processes.

“Workday, Inc. unlawfully offers an algorithm-based applicant screening system that determines whether an employer should accept or reject an application for employment based on the individual’s race, age, and/or disability,” the complaint [PDF] read.

The complaint makes no distinction between individual firms’ use of Workday’s systems, as the software in question is used by a wide range of companies. Its central allegation is that HR teams have used Workday’s systems to pre-select applicants in discriminatory ways, and the plaintiff therefore challenges the software itself rather than the details of its implementation at any particular company.

“We believe this lawsuit is without merit,” a Workday spokesperson told IT Pro.

"At Workday, we are committed to trustworthy AI and act responsibly and transparently in the design and delivery of our AI solutions to support equitable recommendations.

"We engage in a risk-based review process throughout our product lifecycle to help mitigate any unintended consequences, as well as extensive legal reviews to help ensure compliance with regulations.”

Mobley contends that Workday has violated laws including Title VII of the Civil Rights Act of 1964, the Civil Rights Act of 1866, the Age Discrimination in Employment Act of 1967, and the ADA Amendments Act of 2008.


Without careful consideration and oversight of training, machine learning models and AI can contain algorithmic bias.

Another example of this being discovered in hiring is Amazon’s failed attempt to automate its recruitment process, which resulted in a system that discriminated against female applicants.

The model had been trained on a decade of Amazon hiring data, which revealed a tendency to hire men over women.

As a result, the model favoured male applicants and treated indications in a CV that an applicant was a woman, such as references to all-women’s colleges, as reasons to discount them from being hired. Amazon retired the system without ever putting it into use beyond simulated tests.

Rory Bathgate
Features and Multimedia Editor

Rory Bathgate is Features and Multimedia Editor at ITPro, overseeing all in-depth content and case studies. He can also be found co-hosting the ITPro Podcast with Jane McCallion, swapping a keyboard for a microphone to discuss the latest learnings with thought leaders from across the tech sector.

In his free time, Rory enjoys photography, video editing, and good science fiction. After graduating from the University of Kent with a BA in English and American Literature, Rory undertook an MA in Eighteenth-Century Studies at King’s College London. He joined ITPro in 2022 as a graduate, following four years in student journalism. You can contact Rory on LinkedIn.