Publications

EEOC Releases New Guidance for Employers on Title VII Compliance When Using AI


On May 18, 2023, the U.S. Equal Employment Opportunity Commission (“EEOC”) published a technical guidance document (“EEOC’s Guidance”) regarding the use of artificial intelligence (“AI”) in employers’ selection procedures at all stages, including, but not limited to, hiring, promotion, and firing. The EEOC emphasized that employers’ tests or selection procedures, even those done using AI, must comply with Title VII of the Civil Rights Act of 1964. The EEOC’s Guidance is part of the federal agency’s larger effort to help ensure that employers’ use of AI and other new technologies in employment decision making complies with federal equal employment opportunity (“EEO”) laws by educating employers, employees, and other stakeholders on the application of such laws to new technologies.

Title VII Protections – Background and Basic Principles

Title VII prohibits employment discrimination based on race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), or national origin (“Protected Classes”). It also prohibits employers from using facially neutral tests or selection procedures that have the effect of disproportionately excluding candidates in a Protected Class—for example, a physical agility test that disproportionately screens out women.

In 1978, the EEOC adopted the Uniform Guidelines on Employee Selection Procedures (“Guidelines”) under Title VII. The Guidelines set forth guidance on how employers can determine whether their tests and/or selection procedures are lawful for purposes of Title VII’s disparate impact analysis. The Guidelines reference the four-fifths rule as a general rule of thumb for determining whether the selection rate for one group is substantially different than the selection rate for another group. According to the rule, one rate is substantially different than another if their ratio is less than four-fifths (80 percent).

Key Takeaways from EEOC’s Guidance

Through a series of Questions and Answers, the EEOC Guidance sets forth key takeaways for employers about the use of AI in tests or selection procedures. The EEOC relied on Congress’s definition of “AI” to mean a “machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments.” Examples of AI-driven selection procedures include resume scanners, virtual assistants or “chatbots” that screen and reject candidates whose qualifications do not fit pre-determined criteria, video interviewing software, and software that provides “job fit” or “cultural fit” scores based on a game or other traditional tests. “Algorithm” refers to the instructions a computer may follow to accomplish a task. In employment, human resources software and applications use algorithms to process data to evaluate, rate, and make other decisions about job applicants and employees.

Takeaway 1: AI-driven tests and selection procedures must not adversely impact candidates in Protected Classes.

The EEOC Guidance makes clear that if an algorithmic decision-making tool has an adverse impact on candidates in a Protected Class or candidates with a combination of protected characteristics (e.g., a combination of race and sex, such as Asian women), then the use of the tool violates Title VII unless the employer can show that: (1) such use is job related and consistent with business necessity; and (2) the employer did not refuse to adopt an available, less discriminatory alternative employment practice.

Takeaway 2: Employers can use the four-fifths rule to determine whether the use of an AI-driven test or selection procedure causes substantially different selection rates for different groups.

The EEOC Guidance provides that employers can assess whether a selection procedure has an adverse impact on a Protected Class by checking whether its use causes a selection rate for individuals in one group that is substantially less than the selection rate for individuals in another group. A selection rate is the proportion of applicants or candidates who are selected under the selection procedure. To make this determination, the EEOC Guidance provides that employers can use the four-fifths rule.

The four-fifths rule is a general rule of thumb to determine whether the selection rate for one group is substantially different than for another. It states that one rate is substantially different than another if their ratio is less than four-fifths (80 percent). For example, if 80 White applicants and 40 Black applicants take a personality test that is scored using an algorithm, and 48 White applicants and 12 Black applicants proceed to the next round of selection procedures, the selection rate is 60 percent (48/80) for the White applicants and 30 percent (12/40) for the Black applicants. The ratio of the two rates is 30/60, or 50 percent, which is lower than 4/5 (80 percent). Thus, in this example, the selection rate for Black applicants is substantially different than that of the White applicants, which could be evidence of discrimination against Black applicants.
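For readers who want to run the four-fifths arithmetic themselves, the calculation above can be sketched in a few lines of Python. The function names and threshold constant are illustrative, not part of any EEOC tool, and this sketch is only the initial screening step the EEOC describes, not a substitute for a full statistical analysis.

```python
FOUR_FIFTHS = 0.8  # the 80 percent rule-of-thumb threshold

def selection_rate(selected, applicants):
    """Proportion of applicants who advance under the selection procedure."""
    return selected / applicants

def four_fifths_check(rate_a, rate_b):
    """Compare two selection rates.

    Returns the impact ratio (lower rate divided by higher rate) and
    whether it falls below the four-fifths threshold, which may prompt
    a more in-depth inquiry.
    """
    ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
    return ratio, ratio < FOUR_FIFTHS

# Figures from the example above: 48 of 80 White applicants and
# 12 of 40 Black applicants proceed to the next round.
white_rate = selection_rate(48, 80)   # 0.60
black_rate = selection_rate(12, 40)   # 0.30
ratio, flagged = four_fifths_check(white_rate, black_rate)
print(f"Impact ratio: {ratio:.0%}; below four-fifths threshold: {flagged}")
# Impact ratio: 50%; below four-fifths threshold: True
```

As the next paragraph notes, a ratio below 80 percent is only a starting point for further review, not proof of a Title VII violation.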

It is important to note that compliance with the four-fifths rule does not guarantee that a test or selection procedure complies with Title VII. The EEOC Guidance makes clear that the four-fifths rule is a “practical and easy-to-administer” test that may be used to draw an initial inference that the selection rates for two groups may be substantially different, and then prompt a more in-depth inquiry into the procedure in question. In fact, courts agree that the four-fifths rule may be inappropriate in some cases, including where it is not a reasonable substitute for a test of statistical significance. Accordingly, compliance with the rule alone may not be sufficient to show that a particular selection procedure is lawful under Title VII. Employers should use the rule only as a starting point to consider whether a tool is causing a disparate impact.

Takeaway 3: Employers may be held liable under Title VII even if they use third-party vendors to create or administer AI-driven tests or selection procedures.

According to the EEOC Guidance, employers may be responsible under Title VII for a selection procedure that is discriminatory, even if it is developed or administered by an outside vendor. This includes, for example, an employer that asks a third-party vendor to create automatic resume-screening software and video interview software that evaluates applicants based on their facial expressions and speech patterns.

Thus, before relying on a software vendor, employers should conduct due diligence. This may include asking the vendor whether it has taken steps to evaluate whether the use of the AI-driven test or selection procedure would substantially lower the selection rate for candidates in Protected Classes, and whether the vendor relied on the four-fifths rule or on a standard such as statistical significance when making this evaluation. If the vendor states that such an impact should be expected, the employer should consider whether: (1) the use of the tool is job related and consistent with business necessity; and (2) there are alternatives that may meet the employer’s needs and have less of a disparate impact. It is important to keep in mind that if the vendor is incorrect in its own assessment about whether the tool results in a disparate impact, the employer could still be liable.

Takeaway 4: Employers need to educate their employees and continuously evaluate AI-driven tests or selection procedures.

AI and other new technologies offer new efficiencies for employment decisions. Employers can continue to use AI and other algorithmic-based methods to screen applicants and engage in other employee evaluations. However, the EEOC Guidance makes clear that the use of technology will not insulate employers from discrimination claims. Accordingly, employers have an obligation to understand how the technology works, understand the requirements of Title VII, and take steps to ensure the use of such technology does not disparately impact Protected Classes without a justifiable basis. This can be done through education and regular evaluation of technology-driven selection procedures. If an employer finds that a tool is resulting in a disparate impact, it must take steps to reduce the impact or select a different tool. The EEOC Guidance encourages employers to conduct self-analyses on an ongoing basis to determine whether their practices are having a disproportionate negative effect on a Protected Class.

This new EEOC Guidance signals to employers that the agency plans to focus on this issue. Accordingly, employers utilizing, or planning to utilize, AI or other new technologies need to take steps to ensure they are in compliance with Title VII.


Hinckley Allen’s Labor & Employment attorneys are experienced in Title VII compliance for employers. Please reach out to the Hinckley Allen Labor & Employment Group with questions or for assistance in ensuring that your employment selection procedures are compliant with Title VII requirements.

Research and drafting assistance provided by summer associate Bidushi Adhikari.