Bloomberg Tax

Artificial Intelligence Can Help or Harm Employment Practices

May 17, 2022, 8:22 PM

Payroll and human resource offices adopting artificial intelligence in their employment practices should be aware of the associated limitations and legal risks, a commissioner of the Equal Employment Opportunity Commission said May 12.

AI is a form of technology that mimics humans by learning, reasoning, and problem-solving with a high level of autonomy, said Keith Sonderling, an EEOC commissioner. Payroll and human resource offices have slowly been adopting AI over the past few years to make their employment processes more efficient.

“AI has been involved in the decision-making stage of the job life cycle for years,” he said. “AI writes job descriptions, screens resumes, chats with applicants, conducts job interviews, and then predicts if an employee will accept an offer. There’s even AI that will predict how much money an employee will take for a job and how new employees will interact with their coworkers.”

One of the most notable benefits of AI is that, used correctly, it can help eliminate discriminatory bias, he said. For example, AI can be programmed to disregard aspects of a resume that have no bearing on an applicant’s potential job performance, such as an applicant’s name, address, or college. This can improve a company’s workforce diversity and culture.

“AI can help eliminate bias from the early stages of the hiring process,” Sonderling said. “What does an applicant’s name have to do with their job qualifications? A job applicant’s name may signal correctly or incorrectly certain variables, such as their sex, national origin, religion, or race. Likewise, an AI-enabled program that conducts preliminary screening reviews can be engineered to disregard factors such as age, sex, race, disability, and pregnancy. It can even disregard variables that might merely suggest a candidate’s inclusion in a protected class.”
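The kind of pre-screening redaction Sonderling describes can be pictured as a simple filtering step before a resume ever reaches a scoring model. The sketch below is purely illustrative; the field names and the `redact` function are assumptions for this example, not any vendor’s actual schema or API:

```python
# Minimal sketch: remove fields that could signal protected-class
# membership before an automated screener scores a resume.
# Field names are illustrative assumptions, not a real vendor schema.

PROTECTED_SIGNALS = {"name", "address", "college", "birth_year", "photo_url"}

def redact(resume: dict) -> dict:
    """Return a copy of the resume with potentially identifying
    fields removed, leaving only job-relevant content."""
    return {k: v for k, v in resume.items() if k not in PROTECTED_SIGNALS}

resume = {
    "name": "Jane Doe",
    "address": "123 Main St",
    "college": "State University",
    "skills": ["payroll", "HRIS administration"],
    "years_experience": 7,
}

screener_input = redact(resume)
print(screener_input)
# Only "skills" and "years_experience" reach the screener.
```

In practice, as the article notes, removing explicit fields is not enough on its own: other variables (a zip code, a graduation year) can still act as proxies for protected characteristics, which is why Sonderling emphasizes disregarding variables that merely suggest inclusion in a protected class.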

However, AI can reflect a company’s existing discriminatory bias if used improperly, he added. AI requires data to determine how it will function, and data from a company with a homogeneous workforce will only reinforce the bias and status quo that already exist.

For example, many facial recognition programs recognize lighter-skinned faces more accurately than darker-skinned faces, Sonderling said at the American Payroll Association’s 40th Payroll Congress. That is a form of discrimination, and employers will be held liable even if they did not intend the AI program to discriminate based on race.

“In HR and payroll, data is everything,” he said. “Data makes the decision between a good hire and a bad hire, a good promotion and a bad promotion, and what I’m trying to raise awareness of is the difference between lawful and unlawful decisions.”

Legal Compliance

Many states and localities have passed laws regulating certain aspects of AI use in human resources and payroll, but they vary greatly, Sonderling said. Employers should be aware of these laws to ensure they are not relying on illegal practices.

“What we’re seeing from these states is that they are making it impossible to use facial recognition, or they are imposing audit requirements and having disclosure requirements and employee consent,” he said.

All employers, regardless of their use of AI technology, should review the EEOC’s employee selection procedures to ensure compliance with EEOC guidelines and regulations, he said. Employers considering the purchase of AI technology should ask vendors how their services are tested and audited, and whether they comply with relevant federal and state guidelines.
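The employee selection procedures Sonderling points to (the EEOC’s Uniform Guidelines on Employee Selection Procedures) include the well-known “four-fifths rule” of thumb: if one group’s selection rate is less than 80% of the rate for the group with the highest rate, that is generally treated as evidence of adverse impact. A basic audit of an AI screener’s outcomes might compute exactly that ratio. The group labels and counts below are invented for illustration only:

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who were selected."""
    return selected / applicants

def four_fifths_check(rates: dict) -> dict:
    """Compare each group's selection rate to the highest group's rate.
    Ratios below 0.8 suggest possible adverse impact under the
    EEOC's four-fifths rule of thumb."""
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}

# Hypothetical audit numbers, purely for illustration.
rates = {
    "group_a": selection_rate(48, 100),  # 0.48
    "group_b": selection_rate(30, 100),  # 0.30
}
ratios = four_fifths_check(rates)
for group, ratio in ratios.items():
    flag = "possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: ratio={ratio:.2f} ({flag})")
# group_b's rate is 0.30 / 0.48 ≈ 0.63 of group_a's, below the
# four-fifths threshold, so it would be flagged for review.
```

A failed four-fifths check is not by itself a legal conclusion, but it is the sort of outcome-monitoring Sonderling urges employers to perform, whatever algorithm they use.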

“Our laws might be old but they are not outdated,” he said. “Whatever the algorithm you are using, whatever decision, watch for those discriminatory uses and the outcomes. We cannot realize the full potential of AI unless it is developed and utilized consistent with our laws.”

To contact the reporter on this story: Emmanuel Elone in Washington at eelone@bloombergindustry.com

To contact the editors on this story: William Dunn at wdunn@bloombergindustry.com; Jazlyn Williams at jwilliams@bloombergindustry.com