
New York City Proposes Rules to Clarify Upcoming Artificial Intelligence Law for Employers


October 3, 2022


Finally, some welcome news for employers who utilize automated employment decision tools (“AEDT”) in New York City: the Department of Consumer and Worker Protection (“DCWP”) has proposed rules in an attempt to clarify numerous ambiguities in New York City’s Artificial Intelligence (“AI”) law, which takes effect on January 1, 2023.[1]

New York City’s law will restrict employers from using AEDT in hiring and promotion decisions unless it has been the subject of a bias audit by an “independent auditor” no more than one year prior to use.[2]  The law also imposes certain posting and notice requirements to applicants and employees.

As detailed below, the DCWP’s proposed rules are currently under consideration and, even if adopted, may invite more questions than answers, as uncertainty about certain requirements lingers.  Comments can be submitted to the DCWP, and a public hearing will be held on October 24, 2022 to determine whether any or all of the rules will be formally adopted.  Below is a brief summary of the proposed rules.

Clarifying Definitions:  Several key terms that are not defined in the law itself will be defined if the proposed rules are passed.

For example, the proposed rules define “independent auditor” as “a person or group that is not involved in using or developing an AEDT that is responsible for conducting a bias audit of such AEDT.”  Although the proposed definition signals that a vendor who developed the AEDT may not be a sufficiently “independent” auditor (depending on the facts and circumstances), the proposed rules provide an example of a vendor permissibly providing data for a bias audit.  It remains to be seen whether there will be further clarification regarding which vendors may conduct the required bias audit.

The proposed rules define “candidate for employment” as “a person who has applied for a specific employment position by submitting the necessary information and/or items in the format required by the employer or employment agency.”  As such, the proposed rules clarify that potential applicants who have not yet applied for a position would not be covered by the new law.

The AI law itself defines an AEDT as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.”  The proposed rules clarify that the phrase “to substantially assist or replace discretionary decision making” means that the covered tool (a) relies “solely on a simplified output,” (b) uses “a simplified output as one set of criteria where the output is weighted more than any other criterion in the set,” or (c) uses “a simplified output to overrule or modify conclusions derived from other factors including human decision-making.”

Bias Audit:  The proposed rules also specify the requirements for a bias audit, which include calculating the selection rate and impact ratio for each EEO-1 category on an employer’s Equal Employment Opportunity Commission Employer Information Report (i.e., race, ethnicity, and sex).

The calculations set forth in the proposed rules are generally consistent with the EEOC’s Uniform Guidelines on Employee Selection Procedures.  Notably, the proposed rules explain that the selection rate is “the rate at which individuals in a category are either selected to move forward in the hiring process or assigned a classification by an AEDT” as compared to the total number of individuals in the category who applied for a position or were considered for promotion.  Meanwhile, the impact ratio is defined as “either (1) the selection rate for a category divided by the selection rate of the most selected category or (2) the average score of all individuals in a category divided by the average score of individuals in the highest scoring category.”
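To make these definitions concrete, the calculations can be sketched in a few lines of code.  This is an illustration only, not a tool from the proposed rules; the category names and counts below are hypothetical.

```python
def selection_rate(selected, total):
    """Selection rate per the proposed rules: the number of individuals in a
    category selected to move forward, divided by the total number of
    individuals in that category who applied or were considered."""
    return selected / total

def impact_ratios(rates):
    """Impact ratio for each category: its selection rate divided by the
    selection rate of the most selected category."""
    top = max(rates.values())
    return {category: rate / top for category, rate in rates.items()}

# Hypothetical applicant data: {category: (selected, total applicants)}
data = {"Category A": (40, 100), "Category B": (25, 100)}
rates = {c: selection_rate(s, t) for c, (s, t) in data.items()}
ratios = impact_ratios(rates)
# Category A has a selection rate of 0.40 and an impact ratio of 1.00;
# Category B has a selection rate of 0.25 and an impact ratio of 0.625.
```

The same arithmetic applies under the alternative, score-based definition of the impact ratio, substituting each category's average score for its selection rate.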

The proposed rules provide examples of bias audits that indicate that such audits must conduct an intersectional analysis of protected categories (e.g., examining the impact rate for race and sex combined) in addition to analyzing each category independently.  The proposed rules do not address situations in which data may be incomplete for certain categories.  Nor do the proposed rules address circumstances where the data set is too small to give rise to a statistically significant impact ratio.

With respect to publishing the results of the bias audit, the proposed rules would require employers to make the results “publicly available on the careers or job section of their website in a clear and conspicuous manner,” and to include the date of the audit, the distribution date of the tool, and the selection rates and impact ratios for all categories.

Notice:  The New York City law requires employers to provide individuals who reside in New York City, at least 10 business days before use of the AEDT, with notice of: the use of the AEDT; the opportunity to request an alternative selection process or accommodation; the job qualifications or characteristics that the AEDT will use in connection with the assessment; the employer’s data retention policy; and the type and source of data collected for the AEDT.  The proposed rules, if adopted, would outline several different ways by which employers may provide this notice to candidates and employees.

For the law’s requirement of notice regarding the use of an AEDT, instructions for how to request an alternative selection process or accommodation, and the job qualifications and characteristics used by the AEDT, the proposed rules would allow employers to provide notice to applicants (a) “on the careers or jobs section” of an employer’s website, (b) “in a job posting,” or (c) “via U.S. mail or e-mail” at least 10 business days prior to use of the AEDT.  For employees, employers would be able to provide this notice “in a written policy or procedure” at least 10 business days prior to use or through the mechanisms outlined in (b) and (c) above.

Under the proposed rules, an employer would satisfy the law’s requirement of notices regarding the type of data collected, the source of the data, and the data retention policy by posting this information “on the careers or jobs section” of their website or by providing it in writing “via U.S. mail or e-mail” within 30 days of receiving a request to provide such information.

*     *     *

Legislatures and regulatory agencies have continued to focus on employers’ use of AEDT.[3]  Most recently, on September 13, 2022, the EEOC co-hosted an event with the OFCCP titled Decoded: Can Technology Advance Equitable Recruiting and Hiring?.  During the event, EEOC Chair Charlotte A. Burrows and OFCCP Director Jenny R. Yang underscored the need for employers to think carefully about the factors that AEDT are assessing, including whether those factors are tailored to the skills and abilities required by the specific position, and to ensure that AEDT do not have a disparate impact based on protected categories.  Accordingly, employers who have already implemented or may implement AEDT in the workplace should consider the impact of these legislative and regulatory developments to ensure compliance with upcoming laws and enhanced regulatory scrutiny.

_________________________

[1] NYC Dep’t Consumer & Worker Prot., Notice of Public Hearing and Opportunity to Comment on Proposed Rules, https://rules.cityofnewyork.us/wp-content/uploads/2022/09/DCWP-NOH-AEDTs-1.pdf.

[2] For more details, please see Gibson Dunn’s New York City Enacts Law Restricting Use of Artificial Intelligence in Employment Decisions.

[3] For more details, please see Gibson Dunn’s Keeping Up with the EEOC: Artificial Intelligence Guidance and Enforcement Action and Danielle Moss, Harris Mufson, and Emily Lamm, Medley Of State AI Laws Pose Employer Compliance Hurdles, Law360 (Mar. 30, 2022), available at https://www.gibsondunn.com/wp-content/uploads/2022/03/Moss-Mufson-Lamm-Medley-Of-State-AI-Laws-Pose-Employer-Compliance-Hurdles-Law360-Employment-Authority-03-30-2022.pdf.


The following Gibson Dunn attorneys assisted in preparing this client update: Harris Mufson, Danielle Moss, and Emily Maxim Lamm.

Gibson Dunn’s lawyers are available to assist in addressing any questions you may have regarding these developments. To learn more about these issues, please contact the Gibson Dunn lawyer with whom you usually work, any member of the firm’s Labor and Employment practice group, or the following:

Harris M. Mufson – New York (+1 212-351-3805, hmufson@gibsondunn.com)

Danielle J. Moss – New York (+1 212-351-6338, dmoss@gibsondunn.com)

Jason C. Schwartz – Co-Chair, Labor & Employment Group, Washington, D.C.
(+1 202-955-8242, jschwartz@gibsondunn.com)

Katherine V.A. Smith – Co-Chair, Labor & Employment Group, Los Angeles
(+1 213-229-7107, ksmith@gibsondunn.com)

© 2022 Gibson, Dunn & Crutcher LLP

Attorney Advertising:  The enclosed materials have been prepared for general informational purposes only and are not intended as legal advice.


