A class-action lawsuit is a warning shot on one AI risk for businesses

A class-action lawsuit targeting HR and finance platform Workday Inc. may come to include millions of potential victims, and it offers a cautionary tale, along with some important lessons, for companies that use artificial-intelligence tools in their hiring decisions.

The lawsuit, brought by Derek Mobley in the Northern District of California, claims that Workday’s AI-based applicant-recommendation system scored and then discriminated against job applicants on the basis of race, age and disability. 


Mobley, who is over 40, and other named plaintiffs said they applied for hundreds of jobs with companies using Workday tools, only to be met with silence. The lawsuit alleges that Workday’s tools scored each applicant and flagged for customers which candidates they should hire, and it contends that Workday acted as an “agent” of its customers.

Workday, in court documents, has denied the allegations and asked that the lawsuit be dismissed. The company disputes that it offers “employment recommendations” at all to customers using its software, and it argues that it therefore is not at fault for any potential discrimination. Workday also said in court documents that “1.1 billion applications were rejected using Workday” during the period at issue and that “any notice would still invite potentially hundreds of millions of potential plaintiffs to file their claims in this case.”

Connor Spielmaker, principal, corporate communications for Workday, said in a statement that the lawsuit was without merit. 


“Workday’s AI recruiting tools do not make hiring decisions or automatically reject candidates — hiring decisions are always made by our customers, who maintain full control and human oversight,” Spielmaker said. “These tools look only at the qualifications listed in a candidate’s job application and compare them with the qualifications the employer has identified as needed for the job.”

He said the programs are not trained to use, or even identify, protected characteristics like age, race or disability and that the court has already dismissed all claims of intentional discrimination — with no evidence that the technology results in harm to protected groups.


In May, a judge granted the lawsuit class-action status. The order stressed that Workday used the term “recommendation” both on its website and in documents provided early in the discovery process.

Workday has questioned whether a true “class” exists for the lawsuit, given the many factors involved in job applications, and how such a class could be defined. The judge said those issues could be worked out over the course of the litigation.

Companies adjust hiring practices

The lawsuit creates a fundamental compliance challenge for businesses, similar to the “employment testing” lawsuits of the 1970s through the 1990s. The Supreme Court ruled in Griggs v. Duke Power Co. (1971) and Albemarle Paper Co. v. Moody (1975) that employment practices, even if neutral on their face, could be unlawful if they disproportionately harm specific groups. Employers found they could not simply rely on professionally developed tests without validating them for job-relatedness and checking for disparate impact, and companies today remain fully liable for discrimination when using AI hiring tools.

“But most AI tools operate as ‘black boxes’ that make required bias testing nearly impossible to conduct,” said Brian Patterson, partner in the Houston office of law firm Bracewell LLP, in an email. “Employers will need to treat AI hiring tools like any other selection criteria and obtain confidence from AI vendors that these tools do not have disparate impact on protected groups.”
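
The “bias testing” Patterson refers to is often operationalized with the EEOC’s “four-fifths rule,” which flags a selection procedure when any group’s selection rate falls below 80% of the highest group’s rate. Here is a minimal sketch of that check; the group names and applicant counts are hypothetical and do not come from the case:

```python
# Sketch of the EEOC "four-fifths rule" disparate-impact screen.
# Group names and applicant counts below are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants who were selected."""
    return selected / applicants

def four_fifths_check(rates: dict) -> dict:
    """Mark each group True if its rate is at least 80% of the best rate."""
    top = max(rates.values())
    return {group: rate / top >= 0.8 for group, rate in rates.items()}

# Hypothetical outcomes from an AI screening tool
rates = {
    "group_a": selection_rate(60, 100),  # 0.60
    "group_b": selection_rate(30, 100),  # 0.30
}

print(four_fifths_check(rates))  # group_b fails: 0.30 / 0.60 = 0.5 < 0.8
```

A failing ratio does not by itself prove discrimination, but it is the kind of red flag employers would need vendors to test for and explain.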


AI tools have grown tremendously in the few years since OpenAI’s ChatGPT took the world by storm in 2022. Many workers use chatbots for career and job advice, and many managers use ChatGPT and other tools to determine which workers should get raises and promotions. Experts, however, have warned about potential litigation arising from the use of AI tools, including “hallucinations,” in which a generative AI tool gives incorrect answers.

The Workday case could slow AI adoption among employers for certain uses until they become more comfortable that AI vendors have properly validated their systems, similar to how companies pivoted from use of certain cognitive and personality tests.


AI can still be used for administrative purposes in hiring, such as parsing resume data into standardized formats, organizing material or scheduling interviews, since those tasks do not involve decisions about individual candidates. But the moment AI is used to screen, test, rank or make any kind of recommendation, it becomes a selection tool that must be validated, Patterson said.

How AI is put to use

Sheldon Arora, CEO of StaffDNA, said the lawsuit shows that AI is only part of the solution in hiring, not the entire answer. It also underscores the need for human supervision at every step along the way.

What AI does well, Arora said, is match the right candidates to the right jobs. But if filters based on a company’s preferences are present, the match will not be optimal and could exclude candidates who do not fit narrow criteria.

“This is where human decision-making and judgment comes into play. There’s a need to look beyond the data and see the nuances of a job candidate’s background and skills. Since humans program AI, it’s critical to have continuous oversight and minimize bias,” Arora said. “Employers should not be implementing AI in the hiring process and let it go completely unchecked.”


David Wimmer, attorney and partner at law firm Swerdlow Florence Sanchez Swerdlow & Wimmer, said there is no proof to date that Workday’s AI systems discriminated. The case likely will lead to a “battle of the experts” as each side tries to convince a jury of its argument.


The certification of the class-action status and a judge’s order to Workday to produce the names of its employer clients is a “huge inflection point” for the company, Wimmer said. Now the firm has to make a choice about how far to take this.


“Do they settle this case now to avoid having to go down that path of having to reveal their clients? Of having to send out notices to how many hundreds of millions of people? Do they say now is the time to cut our losses?” Wimmer said. 


It’s not just Workday that faces potential liability. It’s all of the companies that used Workday’s systems, Wimmer said. If a jury finds there was disparate impact, that could expose those customers too.


What companies can and should do is make sure any agreement with a software vendor comes with defense and indemnification provisions and assurances from the vendor that the systems have been tested for disparate impact. Companies also need to do their own diligence, Wimmer said.


“Companies should be testing it,” he said. “Are they running through and checking, manually, some of the folks who have been passed through, or some of the folks that have been passed on? Are they double-checking to see if this makes sense?”

The Workday case is not the only lawsuit regarding AI in hiring. In addition, some states have passed new regulations expressly governing the use of AI in automated employment decisions.

One California regulation, which goes into effect Oct. 1, will require companies to retain records and will place limits on how AI systems can be used in hiring.

“That’s just going to lead to a whole new explosion of litigation,” Wimmer said.

Lisa Dawson

PR and Communications

Healthcare organizations face some of the toughest workforce challenges: tight budgets, lean IT teams and limited tools for sourcing, hiring and onboarding staff. Add in manual scheduling, rising labor costs and high burnout, and the pressure grows. Rolling out complex systems can feel out of reach without dedicated tech support. Even simply evaluating new technology can overwhelm already stretched-thin teams.

These challenges make it clear that technology isn’t just helpful; it’s essential for healthcare organizations, especially when they’re striving to do more with less. Not only are healthcare organizations falling short on implementing new technology, but they’re also struggling to update outdated systems. A 2023 CHIME survey found that nearly 60% of hospitals use core IT systems, such as EHRs and workforce platforms, that are over a decade old. Outdated tools can’t integrate or scale, creating barriers to smarter staffing strategies. But the opportunity to modernize is real and urgent.

Tech in Patient Care Falls Short

In healthcare, technology has historically focused on clinical and patient care, and workforce management tools have taken a back seat to updates of patient care systems. Yet many big tech companies have failed when it comes to customizing healthcare infrastructure and connecting patients with providers. Google Health shuttered after only three years, and Haven, the Amazon-backed venture intended to disrupt healthcare and health insurance, disbanded three years later.

Why the failures? It’s estimated that nearly 80% of the patient data that technology systems must use to create alignment is unstructured and trapped in data silos. Integration issues naturally arise when there’s a lack of cohesive data that systems can share and use. Privacy considerations surrounding patient data are a challenge as well. Across the healthcare continuum, federal and state healthcare data laws limit how seamlessly technology can integrate with existing systems.

Why Smarter Staffing Is Now Essential

These data and integration challenges also hinder a healthcare organization’s ability to hire and deploy staff, an urgent healthcare priority. The U.S. will face a shortfall of over 3.2 million healthcare workers by 2026. At the same time, aging populations and rising chronic conditions are straining teams already stretched thin.

Smart workforce technology is becoming not just helpful, but essential. It allows organizations to move from reactive staffing to proactive workforce planning that can adapt to real-world care demands.

Global Inspiration: Japan’s AI-Driven Workforce Model

Healthcare staffing shortages aren’t just a U.S. problem. So, how are other countries addressing this issue? Countries like Japan are demonstrating what’s possible when technology is utilized not just to supplement staff, but to transform the entire workforce model. With one of the world’s oldest populations and a significant clinician shortage, Japan has adopted a proactive approach through its Healthcare AI and Robotics Center, where several institutions like Waseda University and Tokyo’s Cancer Institute Hospital are focusing on developing AI-powered hospitals.

Japan’s focus on integrating predictive analytics, robotics and data-driven scheduling across elder care and hospital systems is a response to its aging population and workforce shortages. From robotic assistants to AI-supported shift planning, Japan’s futuristic model proves that holistic tech integration, not piecemeal upgrades, creates sustainable staffing frameworks.

Rather than treating workforce tech as an IT patch for broken systems, Japan’s approach embeds these tools throughout care operations, supporting scheduling, monitoring, compliance and even direct caregiving tasks. U.S. health systems can draw critical lessons here: strategic investment in integrated platforms builds resilience, especially in a labor-constrained future.

The Power of Smart Workforce Technology

In the U.S., workforce management is increasingly seen as more than a back-office function; it’s a strategic business operation that directly impacts clinical outcomes and patient satisfaction. Smart technology tools are designed to improve care quality, staff satisfaction, scheduling, pay rates, compliance and much more.

For example, by using historical data, patient acuity, seasonal trends and other data points, organizations can predict their staffing needs more accurately. The result is fewer gaps in scheduling, fewer overtime payouts and a more flexible schedule for staff. AI-powered analytics can help healthcare leadership teams spot patterns in absenteeism, track productivity and forecast needs across multiple clinical areas in real time. Workforce management tools can help plan scheduling proactively rather than reactively, driving efficiency and reducing costs.
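
As a toy illustration of the forecasting idea described above, consider the sketch below. The census numbers, nurse-to-patient ratio and function names are invented for this example and are not drawn from any vendor’s product:

```python
# Toy staffing forecast: trailing moving average of daily patient census,
# converted into a nurse requirement. All figures are hypothetical.

def forecast_census(daily_census: list, window: int = 7) -> float:
    """Forecast tomorrow's census as the average of the last `window` days."""
    recent = daily_census[-window:]
    return sum(recent) / len(recent)

def nurses_needed(census: float, patients_per_nurse: int = 4) -> int:
    """Ceiling division: staff enough nurses to cover the forecast census."""
    return int(-(-census // patients_per_nurse))

history = [28, 30, 27, 32, 31, 29, 33]  # last 7 days of census, hypothetical
census = forecast_census(history)
print(census, nurses_needed(census))  # 30.0 8
```

Real workforce platforms layer in acuity, seasonality and absenteeism signals, but the core loop is the same: forecast demand, then translate it into a staffing requirement.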

Why So Many Are Still Behind

Despite the clear benefits, many healthcare organizations are slow to adopt smart tools that empower their workforce. Several things are holding them back from going all-in on technology:

Financial Pressures

Over half of U.S. hospitals are operating at or below break-even margins. For them, investing in new technology solutions can feel financially unfeasible. Scalable, subscription-based and even free workforce management tools are available, but many organizations are unaware of them or lack the resources to source these products. Workforce management tools can deliver long-term return on investment for most organizations; the key is taking the time to understand where the value lies and which tools to invest in.

Outdated Core Systems

Many facilities still depend on legacy technology infrastructure that lacks real-time capabilities, and a handful of large workforce management vendors dominate hospital systems. Smaller, real-time tools that offer innovative solutions for scheduling, workforce hiring, rate calculators and more are available at a fraction of the cost.

Competing Priorities and Strategic Blind Spots

Healthcare organizations and hospitals juggle many high-priority business objectives and regulatory demands. Digital transformation naturally falls down the priority list, causing them to miss improvements that can lead to long-term stability. With patient care and provider satisfaction at the top of the priority mountain, technology changes can be easily overlooked or shoved aside when other business objectives are perceived to “move the needle” more.

Poor Change Management

Even the best technology efforts can fail without the right strategy for adoption and support from senior leadership. Resistance from staff, lack of training, or poor rollout communication can undermine success. Effective change management—clear leadership, role-based training and feedback loops—is essential.

Faster than the speed of technology

Change needs to come quickly to healthcare organizations when it comes to managing their workforce efficiently. Smart technologies like predictive analytics, AI-assisted scheduling and mobile platforms will define this next era. These tools don’t just optimize operations; they empower workers and elevate care quality.

Slow technology adoption continues to hold back the full potential of the healthcare ecosystem. Japan again offers a clear example: in 2019 it had one of the lowest remote-work adoption rates, with only 19% of companies offering remote work. Within just three weeks of the COVID-19 crisis, that share had more than doubled to 49%, proving that technological transformation can happen fast when urgency strikes. The lesson is clear: healthcare organizations need to modernize faster for the sake of their workforce and the patients who rely on providers to deliver care.
