EEOC Urges Employers to Review Artificial Intelligence and Other Technology Used in the Selection Process for Bias
According to a recent study by the Society for Human Resource Management, nearly 80% of the HR departments that use AI or other algorithm-based tools rely on them during the hiring process, and over 10% use them when making promotion or succession-planning decisions.
Last week, the EEOC issued a technical assistance document, Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence, emphasizing the need to monitor these tools to ensure they do not disproportionately exclude persons protected under Title VII. (Earlier, in May 2022, the EEOC released a companion document, The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees.)
What is Prohibited?
Title VII prohibits selection procedures that adversely impact members of a protected class. One way to identify adverse impact in hiring is to determine whether the selection rate for the protected class is “substantially” less than the rate for others.
The EEOC first issued guidance on determining and avoiding selection bias in 1978. Early cases analyzed the legality of employment testing scores, educational requirements, physical fitness standards, and other criteria that disproportionately exclude women or minorities. As the EEOC explained last week, this prohibition against adverse impact in selection also applies when software plays a role in the selection process.
What Types of Software are Involved?
The publication addresses any algorithmic decision-making tool used when making hiring, promotion, or termination decisions. Examples include resume scanners that prioritize applications using certain keywords, video interviewing software that evaluates candidates based on their facial expressions and speech patterns, chatbots that screen applicants, and testing software that scores applicants on personality or “job fit.”
How is Adverse Impact Determined?
The EEOC does not provide a definitive method for determining adverse impact. It notes that the four-fifths rule, used since the late 1970s, serves as a “general rule of thumb” when assessing selection bias. However, the EEOC also reminds employers that the rule may not be appropriate in all cases, “especially where it is not a reasonable substitute for a test of statistical significance.”
Under the four-fifths rule, selection rates are substantially different if the ratio between the rates is less than 80%. In the EEOC’s example, the algorithm at issue resulted in a 30% selection rate for African American applicants and a 60% selection rate for white applicants. This yields a selection ratio of 50% (30/60). Because 50% is less than the 80% benchmark provided by the four-fifths rule, it could constitute evidence of discrimination in the selection process.
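The arithmetic behind the EEOC’s example can be sketched in a few lines of Python. This is an illustrative sketch only, not an EEOC tool or a statistical significance test; the function name and structure are our own.

```python
def four_fifths_check(protected_rate: float, comparison_rate: float,
                      threshold: float = 0.8) -> tuple[float, bool]:
    """Compare two selection rates under the four-fifths rule of thumb.

    Returns the impact ratio (protected rate divided by comparison rate)
    and whether it falls below the threshold, which may indicate adverse
    impact. This is a rough screen, not a test of statistical significance.
    """
    ratio = protected_rate / comparison_rate
    return ratio, ratio < threshold

# The EEOC's example: a 30% selection rate for African American applicants
# versus a 60% selection rate for white applicants.
ratio, flagged = four_fifths_check(0.30, 0.60)
print(f"Impact ratio: {ratio:.0%}; below four-fifths benchmark: {flagged}")
# → Impact ratio: 50%; below four-fifths benchmark: True
```

As the EEOC cautions, a ratio below 80% is only preliminary evidence; where sample sizes allow, a test of statistical significance may be the more appropriate measure.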
How Can Employers Avoid Adverse Impact?
The EEOC advises employers to test for adverse impact during the software development phase. Doing so allows the employer to create alternative algorithms or select a different tool if the original algorithm shows signs of having an adverse impact on a protected class.
Employers should note that they are responsible for avoiding adverse impact in their selection procedures. Even if they use tools designed or administered by a software vendor or other third party, they may be liable for any resulting discrimination. It is therefore critical to determine whether those products or services result in adverse impact.
What is on the Horizon Legislatively?
Employers can expect additional legislation regulating AI and other algorithmic-based decision-making tools in the selection process. Illinois, Maryland, and New York City already regulate these tools. Other jurisdictions, including California and Washington D.C., are working on similar efforts. Even the U.S. Congress has considered legislation. The Algorithmic Accountability Act, first introduced in 2019, would require companies to evaluate their algorithmic tools for bias and discrimination.
What Actions Should Employers Take Now?
Employers should review any software, algorithms, or AI used in their selection procedures and test those tools for adverse impact, including tools designed or administered by outside vendors. They should also monitor the EEOC’s evolving guidance and the growing body of federal, state, and local legislation regulating algorithmic decision-making tools.