Breaking News! New York Has a New AI Bias Law

Submitted by Elizabeth

Artificial intelligence (AI) isn’t sci-fi tech any longer. It now plays a major role in virtually everyone’s life, even if they don’t realize it.

For example, in recent years, it has become increasingly common for employers to use AI technology for the purposes of job recruiting and talent acquisition. Theoretically, AI can help companies make smarter hiring decisions more efficiently by quickly reviewing applications to identify the strongest (and, potentially, the weakest) candidates.

Ideally, AI would also help guard against subconscious biases that can impact a hiring manager’s decision-making. However, some have expressed concern that AI may inadvertently “learn” biases for or against certain candidates.

New York City has responded accordingly. It passed the Automated Employment Decision Tools law, with enforcement beginning in April 2023, to guard against the unexpected ways AI bias could unfairly affect applicants when companies make hiring decisions.

What Is This New Law?

The law currently applies to any business with offices in any borough of New York City. It serves to ensure that organizations that use AI in their hiring processes let job candidates know they are doing so, and it may also guard against companies using AI technology that could be prone to bias.

The Purpose of the New Law  

To understand the purpose of this new law, it helps to consider a very basic example of how an AI recruiting tool could “accidentally” allow bias to influence its decisions.

Bias in hiring can be a significant issue both within individual companies and across entire industries. It can influence how workers in those industries and companies fare based on characteristics that may have very little to do with their actual job performance.

For example, in a particular industry, men might be more likely to thrive than women simply because the industry has historically been male-dominated. As a result, male workers in this field may not only be hired more frequently than women but may also be more likely to receive promotions.

This trend could affect how an AI technology sorts through job applicants. If the historical data an AI tool reviews to learn which qualities make someone a good fit at a particular company shows that men have succeeded there more often than women, the tool may learn to prioritize male applicants over female applicants. In that case, past biases would lead the AI to perpetuate those same biases.
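
To make this concrete, here is a minimal, hypothetical sketch (written in Python, using synthetic data and the scikit-learn library) of how a model trained on biased historical hiring records can reproduce that bias. Every feature, number, and name in it is an illustrative assumption, not a description of any real hiring tool.

```python
# Illustrative only: a toy model trained on synthetic, historically biased
# hiring data. All names and numbers are assumptions made up for this example.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Synthetic applicants: a genuine qualification score and a gender flag.
qualification = rng.normal(0, 1, n)   # what *should* drive the decision
is_male = rng.integers(0, 2, n)       # 1 = male, 0 = female

# Historical outcomes reflect past bias: equally qualified men were hired
# more often than women.
past_hire_prob = 1 / (1 + np.exp(-(qualification + 1.5 * is_male - 1)))
hired = rng.random(n) < past_hire_prob

# A model trained on that history learns the bias along with the real signal.
X = np.column_stack([qualification, is_male])
model = LogisticRegression().fit(X, hired)

# Score two new applicants with identical qualifications who differ only
# in the gender flag.
print("Female applicant:", model.predict_proba([[0.5, 0]])[0, 1])
print("Male applicant:  ", model.predict_proba([[0.5, 1]])[0, 1])
# The male applicant scores higher despite identical qualifications, because
# the model inherited the bias baked into its training data.
```

In practice, the biased signal is rarely an explicit gender column; it can leak in through proxies in résumés and past evaluations, which is part of why independent audits matter.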

The purpose of the New York City Council’s Automated Employment Decision Tools law is to protect workers from these potential drawbacks of AI. Contrary to what some may fear, the law will not prohibit businesses in New York City from using AI for talent acquisition purposes.

The Key Provisions of the Law

The major provisions of the New York AI bias law are as follows:

  • Audit requirement: If a company wishes to use an AI hiring tool, it must have an independent auditor analyze the tool and confirm that it is not unfairly biased against job applicants because of their race, gender, or ethnicity (see the sketch below for a simplified illustration of the kind of selection-rate comparison an audit might involve).
  • Disclosure requirement: Before using an AI recruiting tool, a company must give all candidates and employees living in NYC at least 10 days’ advance notice that an AI tool will play a role in the hiring process.
  • Data limitations: An organization may be restricted from using certain data to train an AI tool to identify strong candidates if there is reason to believe that data could cause the tool to develop a bias for or against candidates based on certain protected characteristics.

These provisions demonstrate how the New York City AI bias law is meant to serve its intended purpose. By ensuring AI tools are audited before companies use them, and by adding a degree of transparency to the overall process, the law strikes a balance between allowing employers to embrace new technologies and continuing to protect workers from unforeseen consequences.
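
To give a sense of what the audit requirement might involve in practice, auditors commonly compare selection rates across groups, often expressed as an impact ratio. The Python snippet below is a rough, hypothetical sketch of that kind of comparison; the group names, the counts, and the 0.8 threshold (borrowed from the informal “four-fifths” guideline) are illustrative assumptions, not figures taken from the law itself.

```python
# Illustrative only: a simplified selection-rate / impact-ratio check of the
# kind a bias audit might involve. Categories and counts are made up.

def impact_ratios(selected: dict, applicants: dict) -> dict:
    """Each group's selection rate divided by the highest group's rate."""
    rates = {g: selected[g] / applicants[g] for g in applicants}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical audit sample: how many applicants in each group the tool
# advanced to the next round.
applicants = {"men": 400, "women": 400}
selected = {"men": 120, "women": 78}

for group, ratio in impact_ratios(selected, applicants).items():
    flag = "needs review" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

An auditor would look for disparities like this across the categories the law covers, and an employer would then have to decide whether the tool can be used as is, adjusted, or replaced.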

The Potential Impact of the Law

The potential benefits this law could deliver for workers are numerous. Examples include (but are not limited to) the following:

  • Promoting greater diversity in hiring
  • Minimizing the negative impacts of bias
  • Facilitating proper usage of AI, in which the technology helps companies identify candidates who genuinely possess the qualifications necessary to fill certain roles

However, it’s also important to understand that complying with laws like this one can involve some challenges. For example, it is unclear how much it might cost an organization to comply with the law. Smaller companies that lack the necessary funds may not be able to use AI for hiring at all, putting them at a disadvantage relative to larger employers.

On the other hand, employers who choose to embrace AI hiring tools will have to familiarize themselves with the potential legal liabilities involved, but the new law may also give them an added degree of legal protection. If a company can show it complied with the law, and an audit confirmed its AI technology is not prone to bias, it may be less likely to become the target of a discrimination claim or lawsuit.

Politics and Policy  

It is important to understand that widespread use of AI technology is still quite new. When New York City passed the Automated Employment Decision Tools law, it did so in largely uncharted territory.

Lawmakers are essentially “playing catch-up” right now. As AI becomes more ubiquitous, they must act swiftly to guard against the potential unwanted consequences of these technologies. Those involved in making such decisions will also need more thorough research to inform policy on AI in hiring and recruiting.

What Employers Need to Know About the New York AI Bias Law

Any employer operating in NYC that plans on using AI for recruiting purposes must become familiar with the current law. Failure to comply could result in fines and other penalties.

Hopefully, laws such as this one will benefit everyone. However, they may not entirely eliminate bias in hiring practices.

Do you believe a potential employer denied you a job because of bias? If so, you may have grounds to take legal action. Take the Free Case Evaluation on this page to get connected with an independent, participating employment attorney who subscribes to our site and can tell you more about your options. 

