
AI Bias Audits Are Coming. Are You Ready?

A core requirement in a recently enacted New York City law governing the use of AI in employment decisions promises to impact how HR leaders choose, contract with, and oversee technology vendors for years to come.

The law, which took effect July 5, 2023, requires employers in New York City to conduct an annual third-party AI “bias audit” of technology platforms they use for hiring or promotion decisions and to publish the audit findings on their websites. The audits are designed to ensure AI tools used in such systems don’t discriminate on the basis of race or gender.
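Audits of this kind typically compare selection rates across demographic categories and compute impact ratios. The sketch below illustrates that arithmetic with entirely hypothetical counts and placeholder group labels; it is a simplified illustration, not an actual audit methodology.

```python
# Illustrative sketch of the selection-rate and impact-ratio math behind a
# bias audit. All counts and category labels are hypothetical.

# Hypothetical counts: candidates screened by an AI tool, by category
applicants = {"Group A": 400, "Group B": 300, "Group C": 100}
selected = {"Group A": 120, "Group B": 75, "Group C": 20}

# Selection rate = number selected / number of applicants, per category
rates = {g: selected[g] / applicants[g] for g in applicants}

# Impact ratio = each category's selection rate divided by the highest rate
best = max(rates.values())
impact_ratios = {g: rates[g] / best for g in rates}

for g in rates:
    print(f"{g}: selection rate {rates[g]:.2f}, impact ratio {impact_ratios[g]:.2f}")
```

A low impact ratio for a category flags it for closer review; the thresholds and categories an auditor applies depend on the governing rules.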

While the new law covers only one jurisdiction, legal experts and HR technology analysts say it’s only a matter of time before other states and jurisdictions enact similar—if not more sweeping—legislation that will include stipulations to conduct AI bias audits. Some attorneys believe, for example, that future laws may require audits for potential age and disability bias, not just the narrower gender and race discrimination covered by Local Law 144 in New York City.

The result is HR functions will need to take a proactive role in ensuring both internally developed AI tools and any AI used in vendor systems for employment decisions are regularly audited by qualified third-party auditors for bias. The stakes are high, since HR can ultimately be held responsible if regulators find the AI tools it uses for hiring or promotion decisions are discriminatory—even if those tools are part of technology vendors’ platforms.

The U.S. Equal Employment Opportunity Commission (EEOC) recently provided guidance on the legal responsibility employers have when bias is found in their vendors’ AI tools.

Just as 401(k) plans are regularly audited to ensure they meet IRS regulations and computer systems are audited for cybersecurity reasons, so too should HR leaders now expect that the AI they use for employment decisions will be scrutinized in similar fashion.

SHRM Online went behind the scenes with two organizations that recently underwent AI audits and also interviewed experts on what to expect from a bias audit, how to choose a credible third-party auditor and what the future of AI auditing holds for HR functions.

Benefits of AI Bias Audits

While most HR technology vendors conduct third-party AI audits in response to current or pending legislation, some also view them as simply good business practice and an opportunity, through lessons taught by auditors, to learn how to avoid AI-based discrimination in the future.

Sam Shaddox, vice president of legal for SeekOut, a talent intelligence platform in Bellevue, Wash., said his company had a third-party auditor conduct a bias audit of its AI tools for a combination of reasons.

“The reality is that regulators from the New York City Council to the Federal Trade Commission have put employment AI bias directly in the crosshairs with both existing and future laws,” Shaddox said. “Performing a bias audit also is an integral part of not just our broader responsible AI program but also our approach to compliance.”

The audit of SeekOut’s platform, conducted by auditor Credo AI, analyzed the performance of AI features in the vendor’s products with respect to various demographic groups, Shaddox said.

“For example, it assessed whether searches for various job titles in our candidate database provided results that were representative of expected demographic information,” he said. “Having an outside measurement point was helpful in both validating and furthering our commitment to minimizing bias in our product offerings.”
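A representativeness check of the kind Shaddox describes can be pictured as comparing the demographic shares of a result set against an expected baseline. The snippet below is a minimal, hypothetical sketch; the group names, baseline shares, and result counts are invented for illustration and do not reflect SeekOut's or Credo AI's actual methodology.

```python
# Hypothetical check of whether search results are demographically
# representative of an expected baseline. All names and numbers are invented.

expected = {"women": 0.45, "men": 0.55}      # assumed labor-pool baseline shares
observed_counts = {"women": 38, "men": 62}   # hypothetical top-100 search results

total = sum(observed_counts.values())
for group, count in observed_counts.items():
    share = count / total
    gap = share - expected[group]
    print(f"{group}: observed {share:.2f}, expected {expected[group]:.2f}, gap {gap:+.2f}")
```

Large gaps between observed and expected shares would prompt an auditor to investigate the search algorithm more closely.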

Widespread Impact

Evelyn McMullen, a research manager specializing in recruiting and talent management for Miami-based Nucleus Research, said that because liability for any AI-created bias in employment decisions ultimately falls to employers, not vendors, it’s vital for HR to encourage its vendor-partners to have their AI tools audited, whether or not the vendor operates in a jurisdiction where legislation requires it.

“It’s important for one to bring in an independent auditor that specializes in AI bias to show you’re trying to comply with the EEOC guidelines,” McMullen said. “The first financial penalties will likely fall on employers that fail to at least attempt to address bias in algorithms, even though specific criteria or mandates may not have been set.”

Andrew Gadomski, founder and auditor general of Aspen Analytics, a Ventnor, N.J.-based HR technology advisory firm that conducts AI bias audits, said the need for AI auditing is likely here to stay, given current and pending legislation not just in the United States but around the world.

“This isn’t just a New York issue or an Illinois issue; there’s a lot of pending legislation on the books in other states that will regulate the use of AI for employment decisions,” Gadomski said. “The number of technology vendors that only operate in one jurisdiction, and that only get job candidates from that same jurisdiction, is very low. The need for AI audits isn’t going away.”

An Audit in Action

Another HR technology vendor that opted to have its AI tools audited by a third party is New York City-based Pandologic, which provides programmatic recruitment and conversational AI services. Kristen Boyle, vice president of marketing for Pandologic, said the company chose to have its technology audited in advance of enactment of the NYC law or other legislation that might impact its operations.

“We realized it was important, even if it wasn’t necessarily required of us, to be proactive as an HR technology vendor and audit our own technology,” Boyle said. “We wanted to ensure we were using our algorithms in the correct way to reduce bias and then share that information with our customers.”

Pandologic hired third-party auditor Vera to review the company’s programmatic algorithms and its conversational AI for bias. The Vera audit team began the process by gathering historical data from Pandologic for analysis.

“The auditor did a series of requirements-gathering in the beginning, working with our product team to analyze the different AI models behind our programmatic technology,” Boyle said. “They wanted to understand which of those models could potentially introduce bias to the process.”

The auditors identified a few algorithms with such potential and “did a deeper dive on them,” Boyle said, recreating a number of use cases of how different customers interact with the Pandologic platform.

On the programmatic recruitment side, auditors analyzed Pandologic’s “job expansion” algorithm, which is designed to help customers increase the visibility of their job postings in additional ZIP codes, Boyle said.

“We wanted to look at whether targeting those expanded ZIP codes was going to yield greater or lesser candidate diversity than the original ZIP codes programmed into a hiring campaign,” Boyle said. Audit results came back in a favorable light. “We found the ZIP code expansion algorithms were actually increasing diversity of the labor force being targeted with regard to race and gender,” Boyle said.
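One simple way to quantify whether an expanded candidate pool is more or less diverse than the original is to compare a diversity index over demographic shares before and after expansion. The sketch below uses Shannon entropy with invented shares; it is one possible measure, not the metric Pandologic's auditor actually used.

```python
import math

# Hypothetical comparison of candidate-pool diversity before and after
# ZIP code expansion, using Shannon entropy over demographic shares.
# Higher entropy means the pool is spread more evenly across groups.
def shannon_diversity(shares):
    return -sum(p * math.log(p) for p in shares if p > 0)

original = [0.70, 0.20, 0.10]   # invented demographic shares, original ZIP codes
expanded = [0.50, 0.30, 0.20]   # invented shares after ZIP code expansion

print(shannon_diversity(expanded) > shannon_diversity(original))
```

With these invented shares, the expanded pool scores higher, i.e., the expansion increased diversity, which mirrors the favorable finding Boyle describes.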

Pandologic said such an AI audit with the auditor it used could cost anywhere from $20,000 to $75,000 depending on the complexity of the products being audited and the analysis required.

Boyle said valuable byproducts of the audit were the lessons learned from the auditor during the process. “It was an eye-opening experience to us in regard to what questions the auditor asked,” she said. “There’s a lot to be gained from a third-party firm coming in to look under the hood and ask some tough questions. AI is something we intend to scale across additional aspects of our business going forward, so we believed it was important to understand where problems can arise with bias.”

For example, Boyle said, one thing the auditor recommended was that Pandologic start including a note in its programmatic advertising to let both job seekers and employers know that AI is being used in the process.

How to Choose a Third-Party Auditor

Because AI bias auditing is a relatively new practice with few established norms, experts say it’s vital that HR leaders verify the technology vendors they partner with are using credible auditors with the proper expertise to conduct any AI bias audits.

“The selection of a qualified auditor is especially important today,” said Shaddox of SeekOut. “A third-party AI bias assessment has to be conducted by a neutral third party that has expertise in AI, data analysis, relevant regulations and industry frameworks.”

Gadomski said that, given the growing demand for AI audits, more opportunistic vendors are entering the space, making a “buyer beware” approach critical. “There’s a lot of snake oil in AI auditing already,” he said.

Experts say HR should work with internal legal teams to ensure AI auditors have no financial or fiduciary interest in a company they’re auditing and that the auditor doesn’t sell its own AI products. Auditors also need to understand a company’s risk profile and which current or pending regulations the organization being reviewed may be subject to based on jurisdictions where it operates.

“For example, an HR technology vendor shouldn’t be allowed to filter its own data for an auditor,” Gadomski said in describing sound auditing practices. “An auditor that asks a technology vendor to send it a laptop without any filtering is good practice, for example, as opposed to an auditor that says ‘send me a spreadsheet and I’ll run an audit for you.’ How does the auditor know the latter is a true representation of your data?”

Ben Eubanks, chief research officer for Lighthouse Research and Advisory in Huntsville, Ala., said third-party auditors also should be chosen for their ability to educate as well as for their auditing acumen.

“You should be looking for a partner that is willing to teach, instruct and explain, because at the end of the day you as the HR leader will have to explain it [the audit findings] to your own leadership,” Eubanks said.

Other experts say AI bias auditing is quickly becoming an evergreen commitment, and third-party auditors should be viewed as long-term partners. And although an annual audit is required by the NYC law and considered a current norm, some experts believe more frequent audits may be a better practice.

“To be safe, it would be wise for employers to conduct a third-party audit every time an algorithm they use is changed in any way to minimize risk,” McMullen said.

Dave Zielinski is principal of Skiwood Communications, a business writing and editing company in Minneapolis.

