Should Employers Be Liable When Their AI Tools Break the Law?
A new New York City law, which bans employers from using AI tools in hiring unless the tools are audited annually for bias, raises many legal questions.
A growing number of companies report using artificial intelligence tools to help manage their businesses. Policymakers are increasingly concerned these tools could be misused or result in unintended consequences.
In New York City, these trends have dovetailed in a new employment law that illustrates one way policymakers aim to regulate AI tools: by holding companies accountable for any harms caused by the technology they are increasingly buying and using. But for some employment attorneys, this strategy fails to account for one circumstance: Many employers don’t have the technical skill to assess the impact of the tools they’re using.
Effective Jan. 1, 2023, employers and employment agencies in New York City will be banned from using AI tools to guide their hiring and promotion decisions—unless those tools are annually tested to ensure they don’t discriminate against job applicants and workers based on their race, ethnicity and sex.
They also will have to give workers 10 business days’ notice before subjecting them to an audited tool, and must tell them they can request “an alternative selection process or accommodation.”
The law, the first of its kind in the U.S., aims to address growing concerns that AI can perpetuate biases and screen out qualified job candidates.
But some digital rights advocates wish it went further. They argue the measure should be expanded to apply to employment decisions beyond hiring and promotion, and require audits that check AI tools for a wider range of biases, such as those based on a worker’s disability, age or sexual orientation.
The legislation, passed by the New York City Council in November, subjects employers and employment agencies to fines if they fail to comply. But Randi May, who represents employers as a partner at Hoguet Newman Regal & Kenney, said the onus for compliance should fall on the companies that develop and sell these AI tools, too.
Most employers and the human resources professionals tasked with using AI tools are not artificial intelligence experts, she said. They “don’t know if the tool is inadvertently going to have a disparate impact. [They] don’t necessarily understand … the algorithms.”
Employers should “lean on the AI tool providers more, and tell them that there’s this law, and ask them what their intentions are, and how they’re planning to comply,” May said. “If you want us to keep using your [tool], you have to give us something that’s compliant. Otherwise, we’re going to go someplace else and get a tool that is compliant there.
“If I were a plaintiff’s attorney, and I wanted to sue somebody based on the tool … it would be a class action against the employer as well as the AI company,” she added. “I wouldn’t rule it out.”
But AI tool providers don’t necessarily know how to make sure their clients are complying with the law, either.
While some of these companies suggest they understand the assignment—the founder of AI recruiting platform Pymetrics, for example, has said the company checks its tools for disparate impact, and clients of such companies as Suited and HireVue say the tools improve workforce diversity—the New York City law does not provide clear criteria for compliance, attorneys say.
While the law requires a bias audit, it does not provide details on who qualifies as an “independent” auditor, or the criteria auditors should rely on to determine whether a tool has passed an audit, said James Paretti Jr., a shareholder at Littler Mendelson.
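The law itself does not name a metric, but one benchmark auditors could plausibly borrow is the “four-fifths rule” from the EEOC’s Uniform Guidelines on Employee Selection Procedures, under which a selection rate for any group below 80% of the highest group’s rate is treated as evidence of adverse impact. As a purely hypothetical illustration (the group labels and counts below are invented, not drawn from any real audit or from the law’s requirements), the arithmetic looks like this:

```python
# Hypothetical sketch of the EEOC "four-fifths rule" for adverse impact.
# The applicant and advancement counts are invented for demonstration only;
# the NYC law does not mandate this (or any) specific audit metric.

groups = {
    # group: (applicants screened by the tool, applicants advanced by the tool)
    "group_a": (200, 90),
    "group_b": (150, 45),
}

# Selection rate = advanced / screened, computed per group.
rates = {g: advanced / screened for g, (screened, advanced) in groups.items()}

# Impact ratio compares each group's rate to the highest-selected group's rate.
highest = max(rates.values())
for group, rate in rates.items():
    impact_ratio = rate / highest
    flag = "potential adverse impact" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} -> {flag}")
```

A real audit would involve far more than this arithmetic, including statistical significance testing and intersectional categories, which is part of why attorneys say the law’s compliance criteria remain unclear.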
Filling in those blanks might not be straightforward, Paretti said. Noting an initiative the U.S. Equal Employment Opportunity Commission launched last fall to examine AI employment tools and whether they comply with civil rights laws, the attorney said, “They are just in the process of starting a task force to try to dig in and understand what some of these issues are.”
“If the EEOC is saying we need to know more here, I would have thought that the New York City Council … whose primary area of responsibility is not the enforcement of non-discrimination and employment laws … would need that education as well,” Paretti said.