Employers, Vendors Plan Ahead of NYC's AI Law Enforcement Date

While New York City employers have more time to prepare for the enforcement of the city's automated employment decision tool (AEDT) law in April, there's still some doubt that companies will be ready to meet the law's requirements. Some stakeholders even expect the law to delay the use of artificial intelligence tools in hiring and promotion decisions.

The AEDT law, passed in December 2021, had an original enforcement date of Jan. 1, 2023, but that date has been pushed back to April 15. Employers in New York City hope the delay will give them enough time to conduct bias audits of their AEDTs.

To address the concerns of employers, vendors and other stakeholders, the city's Department of Consumer and Worker Protection held its second public hearing on the AEDT law in January, but changes to the law have yet to be made.

The Society for Human Resource Management hosted a panel discussion about the law on Feb. 16, allowing the employer community to share feedback with New York City Council members and staff from the mayor's office.

In the meantime, Terry Baker, president and CEO of recruitment marketing technology firm PandoLogic, thinks the delay is a sign that New York companies' hiring systems may not meet the law's requirements.

"There is recognition that the market is not ready for compliance and adoption," he said. "Most employers have not yet audited the tools they use which will be subject to the AEDT governance. In addition, the NYC Department of Consumer and Worker Protection still needs to clarify what constitutes an adequate audit under the law."

PandoLogic will be subject to the law both as a company headquartered in New York City and as a vendor that offers its customers a talent acquisition platform and a conversational AI tool that automates the collection of information from candidates for the purpose of vetting them for specific job openings.

"We certainly fall under this law both as an employer and as a technology provider, and we are particularly attuned to what the requirements are," Baker said.

However, because the requirements to successfully audit automated hiring systems aren't clear, both employers and vendors are in a difficult position, he said.

"The law defines impact ratio but does not yet define the kind of auditing procedures that would determine whether the audit was done correctly or adequately. Vendors that provide these tools are not yet prepared to demonstrate compliance for that reason," Baker said.

The law defines an AEDT as any computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that generates a score, classification, or recommendation used to substantially assist in making employment decisions.

Because employers are liable under the AEDT law, Baker said, it's critical that vendors with AI and other automated tools used to score and select candidates explain how their tools work—but that's not easy to do.

"Vendors must provide their customers with exposure and transparency because it's the employers that have the liability and it's the employers that have to communicate with the candidates what this process looks like to establish the fairness of the process. They can't do that if they are using third-party tools that don't provide that level of transparency," Baker said.

He added that many companies use a lot of platforms that haven't gone through an audit, and "it's very difficult when using a third-party product to understand what's happening within that code base."

Hilke Schellmann, an assistant professor of journalism at New York University, and Mona Sloane, a senior research scientist at the NYU Center for Responsible AI, have concluded that AI is driving a sweeping change in how hiring is done.

"Many Fortune 500 companies employ AI-based solutions to weed through the millions of job applications the companies receive every year," according to Schellmann and Sloane's project, titled "Holding Hiring Algorithms Accountable and Creating New Tools for Humanistic Research."

"The problem: many companies don't want to reveal what technology they are using and vendors don't want to reveal what's in the black box, despite evidence that some automated decision making systems make biased and/or arbitrary decisions," the researchers said.

As companies gauge the reputational damage and the costs they'll incur if they are found in violation of the law, Baker said employers in New York City should consider shifting more responsibility onto vendors and having lawyers amend service-level agreements to protect them.

"If employers are smart, they are going to pass the onus of this law to the vendors. Anybody that is hiring at scale in New York City is investing in a lot of third-party products. The average company uses more than 10 third-party tools to enable their entire talent acquisition and employment process," he said.

Data from Aptitude Research shows that 63 percent of companies are using more talent acquisition solutions today than in the pre-pandemic period. As talent acquisition tools evolve, researchers say, AI remains the common denominator. 

The cost of violating the law is another consideration. Each violation of the AEDT law can carry a fine of up to $1,500, a penalty Baker thinks will convince employers to delay their use of AI in hiring.

"I think the law will slow the adoption of AI, unfortunately, because there's a lot of fear with regard to compliance, and there are some stiff penalties," he said.

[SHRM members-only HR Q&A: What is artificial intelligence and how is it used in the workplace?]

One company that is adamant that its technology will escape the law's penalties is SeekOut, a Seattle-based company with a significant number of New York City clients that use its AI-driven talent acquisition and management platform.

According to Sam Shaddox, vice president and head of the legal department at SeekOut, the company has intentionally designed its AI solutions so they don't make hiring decisions, but instead find diverse candidates for employers to reach out to.

"One of the clarifications that the draft implementing regulations make is that the audit and transparency requirements only apply to AI tools that are being used for candidates that have applied to a position," Shaddox said.

He added that the AEDT law and similar legislation will force small and medium-size businesses, as well as vendors, to build more robust compliance teams.

As to whether the law will limit the use of AI tools in hiring, Shaddox said companies will take a variety of approaches to using AI and automation to support their hiring objectives.

"Some employers will reduce the use of AI systems, and some will have greater trust and will use AI systems more. Over the next two to four years, we will see how the law turns out before we start to see an emerging trend on whether employers are excited or not with their use of AI in hiring processes," he said.

Nicole Lewis is a freelance journalist based in Miami. 
