
New York City’s Automated Employment Decision Tool Law Fair Game For Enforcement Beginning July 2023

Artificial intelligence (AI) has become increasingly common in the workplace, as AI tools allow employers to quickly and efficiently evaluate both prospective and existing employees. Just last year, the Equal Employment Opportunity Commission (EEOC) estimated that over 80% of employers use AI to assist in employment-related decision-making. Yet, regulation of workplace AI remains sparse. In New York City, that is set to change next month when the New York City Automated Employment Decision Tool Law (AEDT) will be enforced for the first time.

New York City first passed the AEDT in December 2021, making it unlawful for employers to use automated decision-making tools to screen individuals for employment decisions unless the law’s requirements regarding notice, bias audits and disclosure are satisfied. After a lengthy public comment period and several delays of the enforcement date, enforcement begins on July 5.


Who Must Comply?

As a threshold issue, it remains unclear precisely to whom the AEDT applies. Employers located in New York City as well as employees or applicants residing therein are clearly covered. However, it is less clear whether the regulation also covers companies located outside the city but hiring NYC residents, or companies based in NYC but hiring applicants who may reside outside the city. At this early stage of enforcement, employers that are either located in NYC or making employment decisions regarding employees or applicants residing in NYC should assume they must comply with this law.

What AI Tools Are Covered?

The first issue for employers is understanding what qualifies as an “automated employment decision tool” under the regulations. The definition of covered decision tools was hotly debated during the two AEDT public comment hearings. Ultimately, the guidance provided indicates that the AEDT is intended to apply to (1) computational processes, derived from machine learning, statistical modeling, data analytics or artificial intelligence, that (2) issue a simplified output, including a score, classification or recommendation, that (3) substantially assists, outweighs, overrules or replaces decisions or conclusions made by an employer (4) with respect to employment decisions relating to hiring and promotion.

This remains a complex definition that will almost surely be refined over time. For now, though, it appears clear that only tools used for hiring and promotion decisions (and therefore not termination) are covered. It also appears clear that the law reaches only tools that substantially assist, override or replace human decision-making, so tools that produce scores serving as only one factor in an otherwise human decision-making process may be exempt. By contrast, AI tools used in initial screening or interview selection, where the tool selects candidates and replaces any human screening effort, are almost certainly subject to the requirements of the law.

What Are the Requirements for Covered AI Tools?

There are three primary requirements associated with the AEDT: (1) independent annual bias audits, (2) notice of use and (3) disclosures related to data collection.

  • Independent Annual Bias Audit and Publication of Results—The AEDT requires that any covered AI tool be subject to an annual bias audit performed by an independent source. The AEDT clarifies that “independent” means the audit is performed by someone outside the employer, so internal groups, departments or other entities working for the employer or financially interested in the AI tool will not satisfy this requirement. Generally, the audit must evaluate the potential disparate impact of the AI tool for each EEO-1 protected category; the first sketch following this list illustrates the underlying arithmetic. Employers may use either historical or test data to conduct the audit. Finally, employers must publish the results of the bias audit on their careers page or an external website, including the date of the most recent audit, a summary of the results (including the data source, the number of candidates and any individuals excluded from the audit) and the date the employer began using the AI tool.
  • Notice of Usage—The AEDT requires employers to notify both job applicants and existing employees whenever a covered AI tool will be used for a qualifying purpose (such as in connection with an applicant’s potential hire). This notice must be given no fewer than 10 business days before the individual is evaluated by the AI tool; the second sketch following this list shows one way to compute that deadline. The notice must also include instructions for requesting an alternative selection process, although the AEDT does not specify what alternative processes would be acceptable and indicates that an employer may not be required to offer one at all.
  • Data Collection and Disclosures—The AEDT requires employers to disclose the type of data collected, the source of that data and the employer’s data retention policy. Notice of these disclosures must be posted on the employer’s website, along with instructions for submitting written requests for that information. Once requested, the information must be provided within 30 days, or the employer must explain why disclosure cannot be made.
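
The bias audit is, at bottom, a statistics exercise: under the city’s final rules, an auditor computes a selection (or scoring) rate for each category and an impact ratio relative to the most-selected category. The Python sketch below is a minimal illustration of that arithmetic only; the category labels and numbers are hypothetical, and the 0.8 review threshold is the traditional “four-fifths” heuristic, which the NYC rules do not themselves impose.

```python
# Illustrative impact-ratio math of the kind an independent bias audit
# performs. All figures and category labels below are hypothetical.

# (candidates screened by the tool, candidates selected by the tool)
outcomes = {
    "Category A": (500, 150),
    "Category B": (300, 60),
    "Category C": (250, 70),
}

# Selection rate = selected / screened, for each category.
selection_rates = {
    group: selected / screened
    for group, (screened, selected) in outcomes.items()
}

# Impact ratio = a category's selection rate divided by the selection
# rate of the most-selected category (1.0 means parity with that group).
top_rate = max(selection_rates.values())

for group, rate in sorted(selection_rates.items(), key=lambda kv: kv[1]):
    ratio = rate / top_rate
    # The traditional "four-fifths" heuristic flags ratios under 0.8 as
    # possible disparate impact; the NYC rules require publishing the
    # ratios but set no pass/fail threshold.
    flag = "  <-- review" if ratio < 0.8 else ""
    print(f"{group}: selection rate {rate:.1%}, impact ratio {ratio:.2f}{flag}")
```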
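On timing, the 10-business-day notice window is easy to miscount. Below is a minimal sketch of one way to compute the latest permissible notice date, assuming a simple weekday count; it ignores holidays, and its reading of “no fewer than 10 business days before” is one reasonable interpretation, not official guidance.

```python
from datetime import date, timedelta

def latest_notice_date(evaluation_date: date, business_days: int = 10) -> date:
    """Walk backward from the evaluation date, counting weekdays only.

    Hypothetical helper: it ignores holidays and assumes the notice must
    precede the evaluation by 10 full weekdays.
    """
    d = evaluation_date
    counted = 0
    while counted < business_days:
        d -= timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            counted += 1
    return d

# Example: a candidate will be screened by a covered tool on Aug. 7, 2023.
print(latest_notice_date(date(2023, 8, 7)))  # -> 2023-07-24
```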

What Are the Penalties for Failing To Comply?

Failure to comply with the AEDT may result in significant penalties—violations are subject to an initial civil penalty of $375 for each violation on the first day of improper use of the AI tool or other failure to comply with the law. Thereafter, subsequent violations are punishable by civil penalties of between $500 and $1,500. Critically, each failure to comply with a requirement of the AEDT is treated as a separate violation carrying a separate penalty—for example, failing to meet the notice requirements and using an AI tool that has not undergone a timely bias audit are separate violations, each of which may accrue penalties for every day of noncompliance.
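
To see how quickly exposure can compound, consider a hypothetical employer that leaves two requirements (notice and the bias audit) unmet for 30 days. Using the figures above, a rough calculation:

```python
# Rough penalty-exposure arithmetic using the figures cited above.
# Hypothetical scenario: two separate requirements (notice and bias
# audit) go unmet for 30 days; actual penalties are set by the city.

FIRST_DAY = 375        # initial civil penalty per violation
SUBSEQUENT_MIN = 500   # subsequent daily violations: $500 to $1,500 each
SUBSEQUENT_MAX = 1_500

violations = 2
days = 30

low = violations * (FIRST_DAY + (days - 1) * SUBSEQUENT_MIN)
high = violations * (FIRST_DAY + (days - 1) * SUBSEQUENT_MAX)

print(f"Potential exposure over {days} days: ${low:,} to ${high:,}")
# -> Potential exposure over 30 days: $29,750 to $87,750
```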

What Employers Need To Do

The quickly approaching July 5 enforcement date, combined with the law’s remaining uncertainties, means employers operating in NYC, or hiring or employing individuals who reside there, need to conduct a thorough AEDT compliance review immediately, in which they do at least the following:

  • Review their employment decision-making technology to determine whether these tools implicate the AEDT.
  • Suspend use of any covered tools that have not been subject to a required audit until the bias audit is complete.
  • Identify independent partners to conduct required AEDT audits for any covered tools.
  • Internally review company policies, practices and procedures related to data retention, job postings and promotions to ensure compliance with the final AEDT regulations.
  • Conduct training sessions for human resources and management personnel regarding the requirements of the AEDT and how it impacts candidate interactions, internal and external job and promotion postings, and notice requirements. 

For a more complete analysis of the AEDT, its requirements and ways employers can ensure compliance, see our full three-part AEDT series.

Employers should also be mindful that while the AEDT is one of the first wide-ranging and comprehensive regulations of AI tools in employment decisions, it is not alone. For example, a 2019 Illinois law requires job applicants to consent before AI analysis of video interviews, a 2020 Maryland law mandates applicant consent before employers may use facial recognition technology in job interviews, and a pending Washington, D.C., bill would prohibit discriminatory algorithmic decision-making in employment.

Likewise, the EEOC has identified the use of AI tools in employment as a strategic enforcement priority. Just last month, the EEOC issued question-and-answer guidance on the use of AI tools in connection with the requirements of Title VII of the Civil Rights Act, and the agency has previously issued guidance addressing AI tools in connection with the Americans with Disabilities Act. The EEOC guidance makes clear that AI tools that create a disparate impact for protected classes of applicants or employees will run afoul of these laws even if the employer had no discriminatory intent and no human involvement produced the disparity, and even if the tool was developed by a third-party vendor. Notably, the EEOC guidance also sweeps more broadly than the NYC AEDT: it indicates that AI tools that “make or inform decisions about whether to hire, promote, terminate, or take similar actions towards applicants or current employees” may be covered selection procedures under Title VII, whereas the AEDT reaches only tools that “substantially assist or replace” human decision-making.

Given the increasing focus by legislators and regulators on AI tools in employment, employers both within and outside New York City should carefully assess the tools they use to make employment decisions and determine whether bias audits are required by applicable law, or advisable under the circumstances even where not required.