- The city law requires audits, disclosures from companies
- Could serve as model for future legislation
A first-of-its-kind law requiring companies to audit artificial intelligence systems for bias when using them in hiring and promotion decisions recently went into effect in New York City, leaving employers and auditors scrambling to comply.
As of last week, New York City employers must have their automated hiring systems independently audited for bias and publish the results on their company websites, or face fines.
The law comes amid a nationwide push from both lawmakers and the Equal Employment Opportunity Commission to regulate AI’s use at work. Illinois and Maryland have laws on the books regulating the use of AI in video job interviews, and other states are weighing their own legislation on AI bias in hiring.
“Somebody had to go first and sort of start somewhere, and I think that’s where we are with New York City,” said Victoria Lipnic, a former acting chair of the EEOC.
Under the New York City law, employers will have to take an inventory of their hiring and recruitment software to determine if any of it constitutes an “automated employment decision tool,” said Nathaniel Glasser, a partner at Epstein Becker Green who represents employers.
The AEDT distinction means the law will cover many AI-powered tools, including those used for screening candidates and parsing their resumes, as well as automated systems used to interview potential hires.
Employers On Alert
One challenge for companies seeking to comply is that they don’t always collect data on race and gender when evaluating job candidates, which could make the audits less meaningful, Glasser said.
The final rule says that if an employer has historical data, it has to use it in its audit. If an employer doesn’t collect this data, the employer may use “test data” that is not necessarily affiliated with the company, as long as it makes an additional disclosure in its public report.
“Obtaining sufficient information to conduct a statistically significant bias audit may be a problem and it’s one that a number of employers are working through right now,” Glasser said.
The law also places responsibility for audit and disclosure errors solely on the employer, instead of an outside auditor they may hire to handle the process.
“I just think that to the extent there’s a deficient audit, or there’s some issue with the software, you’re gonna find employers on the hook when I think in most instances, they’re outsourcing third party software,” said Lara Shortz, a labor and employment partner at Michelman & Robinson LLP.
There’s no private right of action under the statute. Employers found in violation face a $500 fine for the first offense and $1,500 for each subsequent one.
Though the New York City law was passed in late 2021, the final rule on its implementation wasn’t issued until April of this year.
“The city was very slow in promulgating the rules that explained the law and there was a lot of ambiguity and I don’t think there was enough time to prepare,” said Eli Freedberg, a partner at Littler Mendelson. “I think a lot of companies might be considering putting a pause on the use of AI based tools until they can really assess whether the tools they are using qualify as regulated tools.”
Auditors On Call
The landscape for the firms employers call on to complete bias audits is also changing as they deal with an influx of new business.
An important provision in the New York City ordinance is that the AI bias audit be conducted by a third party who has no vested financial or other interest in the employer. This means employers are working quickly to line up AI consulting firms and third-party vendors to conduct audits that comply with the new law.
Jacob Appel, chief strategist at ORCAA, an AI consulting firm that audits the technology, said the company has seen a significant increase in demand since the New York City legislation passed.
“Now that the law has taken effect, we’re seeing more interest than ever,” Appel said.
Lipnic, who is now a partner at Resolution Economics, a firm that conducts AI audits, said sometimes auditors have to go to the employers multiple times to ensure they’ve received the proper data.
“It’s highly dependent on what kind of shape the data is in,” Lipnic said. “That could take anywhere from a couple of weeks to longer than that if you have to go back for multiple data requests.”
Prior to the New York City ordinance, there had been no set legal framework for what bias audits entail, only industry players developing best practices.
Auditing firms have said they don’t anticipate the new requirements will drastically change how they conduct audits, but that they have been working with legal counsel to make sure their existing systems comply with the law.
A Nationwide Framework?
The New York City law may provide a model for auditing rules that can be adopted in other jurisdictions, but notably it doesn’t provide a legal threshold for when AI-based hiring or recruiting is considered to be biased or to have a disparate impact on protected groups.
The EEOC has provided limited endorsement for a “rule of thumb,” known as the four-fifths rule: if the selection rate for a protected group is less than 80% of the rate for the group with the highest selection rate, the gap likely indicates bias.
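The four-fifths comparison is simple arithmetic on selection rates. The sketch below is a minimal illustration, not a compliance tool; the function names and sample numbers are hypothetical, and a real bias audit under the New York City rule involves additional categories and reporting requirements.

```python
# Hypothetical illustration of the EEOC "four-fifths" (80%) rule of thumb.
# Selection rate = selected / total for each group; the impact ratio is each
# group's rate divided by the highest group's rate. A ratio below 0.8 is the
# conventional signal of possible adverse impact.

def selection_rates(outcomes):
    """outcomes: dict mapping group -> (selected, total). Returns group -> rate."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def impact_ratios(outcomes):
    """Each group's selection rate relative to the highest-rate group."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

def flagged_groups(outcomes, threshold=0.8):
    """Groups whose impact ratio falls below the four-fifths threshold."""
    return {g for g, ratio in impact_ratios(outcomes).items() if ratio < threshold}

# Illustrative data: group_a is selected at 60/75 = 0.80, group_b at 48/80 = 0.60,
# so group_b's impact ratio is 0.60 / 0.80 = 0.75, below the 0.8 threshold.
data = {"group_a": (60, 75), "group_b": (48, 80)}
print(flagged_groups(data))  # {'group_b'}
```

As the article notes, the rule of thumb only flags a statistical gap; neither the EEOC guidance nor the New York City law fixes a legal threshold at which a tool is deemed biased.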
Future state and city laws across the country could provide additional clarity.
In New York City, race, ethnicity, and sex are the only protected classes that bias audits are legally required to take into account. Other groups protected under Title VII of the Civil Rights Act of 1964, as well as those covered by the Americans With Disabilities Act, aren’t included in the required audits.
Jurisdictions outside of New York are likely to expand the law to include other protected categories, Shortz said. California, Washington, D.C., and New Jersey all have proposed legislation regulating AI usage in hiring, with some bills even more expansive than New York City’s statute.
“On the New York side, it feels like they’re trying to keep it as simplified as possible. In other words, you use the software and then you get the audit and you publish the summary of the audit and then you can move forward,” Shortz said. “In California, it’s more of a generalized prohibition against discrimination when using the software.”