
Are You Ready for NYC’s Local Law 144?

A Practical Guide to Complying with New York City’s AI Hiring Law


Artificial intelligence is transforming how companies find and evaluate talent — but with that innovation comes responsibility. New York City’s Local Law 144, which took effect on January 1, 2023, with enforcement beginning July 5, 2023, is one of the first laws in the United States to regulate the use of AI in hiring and promotion decisions.

If your organization uses AI-driven systems or other automated tools to screen candidates for jobs in New York City, you’re now required to audit those tools for bias, be transparent about how they’re used, and notify applicants in advance.

This law represents more than a compliance requirement — it’s a sign of how technology, fairness, and employment law are beginning to intersect.

What Are Automated Employment Decision Tools (AEDTs)?

Under Local Law 144, an Automated Employment Decision Tool (AEDT) is any software or algorithm that uses machine learning, AI, or statistical models to help make hiring or promotion decisions.

In plain terms, if your system gives a score, ranking, or recommendation that influences who gets hired, interviewed, or promoted, it’s likely considered an AEDT.

Examples of AEDTs:

  • Resume-scanning software that ranks candidates

  • AI video interview tools that assess tone or facial expressions

  • Predictive analytics systems that score candidates based on performance data

What’s not considered an AEDT:
A tool is generally not an AEDT if it doesn’t use machine-learning or statistical models to substantially assist or replace discretionary decision-making about candidates.
Examples: a plain spam filter, a firewall rule, or a database that stores résumés but never “scores” them.

Does the law even apply to my role?
LL 144 applies only when an AEDT is used “in the city.” That means at least one of the following is true:

  • The job is tied to a New York City office (even part-time)

  • The role is fully remote but associated with an NYC office

  • The employment agency itself operates in NYC

What Is a Bias Audit?

A bias audit is an independent evaluation designed to check whether an AEDT disproportionately disadvantages any group of people based on race, gender, or other protected characteristics.

Employers must ensure:

  • The audit is conducted by an independent, impartial auditor.

  • The audit examines how the tool impacts different demographic groups.

  • The audit is completed before the AEDT is used.

The goal: to make sure your AI tools aren’t introducing unintended discrimination into your hiring process.

At minimum, the audit must publish selection or scoring rates and impact ratios for each sex, race/ethnicity, and intersectional group, plus the number of individuals whose demographic is “unknown.”
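As a rough illustration of the arithmetic behind those published figures, here is a minimal sketch in Python. This is not the DCWP’s official methodology or any auditor’s tool; the category names and counts are invented for the example. A category’s selection rate is the share of its applicants who were selected, and its impact ratio is that rate divided by the highest selection rate across categories:

```python
# Hypothetical audit data: for each demographic category,
# (number selected, total number of applicants). All values invented.
results = {
    "Male":    (120, 400),
    "Female":  (90, 360),
    "Unknown": (10, 40),   # LL 144 also requires reporting the "unknown" count
}

# Selection rate = selected / applicants, per category
selection_rates = {g: sel / total for g, (sel, total) in results.items()}

# Impact ratio = category's selection rate / highest selection rate
highest = max(selection_rates.values())
impact_ratios = {g: rate / highest for g, rate in selection_rates.items()}

for g in results:
    print(f"{g}: selection rate {selection_rates[g]:.2f}, "
          f"impact ratio {impact_ratios[g]:.2f}")
```

In this toy data set, the “Male” category has the highest selection rate (0.30), so its impact ratio is 1.00, while the other categories land at roughly 0.83. A real audit must report these figures for each sex, race/ethnicity, and intersectional category.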

Legal Requirements Under Local Law 144

To legally use an AEDT for hiring or promotion in New York City, employers and employment agencies must meet three main conditions:

  1. Annual Bias Audit
    The AEDT must have undergone a valid, independent bias audit within the last year.

  2. Public Disclosure
    A summary of the most recent audit results (and the tool’s distribution date) must be publicly available on your company’s or agency’s website.

  3. Advance Notice to Candidates and Employees

    • Give notice at least 10 business days in advance to NYC-resident candidates or employees that an AEDT will be used.

    • State the specific job qualifications or characteristics the AEDT will evaluate.

    • Include instructions on how to request a reasonable accommodation or alternative selection process.

Penalties for Non-Compliance

Violating Local Law 144 can be costly. Employers or agencies that fail to comply may face:

  • Up to $500 for the first violation (and each additional violation on the same day)

  • $500–$1,500 for each subsequent violation

Each day a non-compliant AEDT is used—and each missed notice to a candidate or employee—counts as a separate violation.

Remember: The employer or employment agency—not the software vendor—is ultimately liable for completing a valid bias audit before using the tool.

Enforcement and Legal Rights

The NYC Department of Consumer and Worker Protection (DCWP) enforces LL 144; discrimination claims still go to the NYC Commission on Human Rights.

Importantly, Local Law 144 does not replace or limit an employee’s or candidate’s right to:

  • File a lawsuit under other employment laws, or

  • Bring a complaint under the NYC Human Rights Law for discriminatory practices.

When the Law Took Effect

Local Law 144 took effect on January 1, 2023, and enforcement began on July 5, 2023. Employers and employment agencies operating in New York City should now have compliant policies and documentation in place — including bias audit records, public disclosures, and clear notification procedures.

Quick Compliance Checklist

Here’s a simple checklist to help ensure your organization meets the law’s requirements:

  • Conduct an independent bias audit of every AEDT used in hiring or promotions every 12 months.

  • Publish the audit summary and tool distribution date on your website.

  • Provide at least 10 business days’ advance notice to NYC-resident candidates and employees before using the tool.

  • Disclose evaluation criteria and allow alternative processes or accommodations.

  • Keep documentation of all compliance activities.

  • If any demographic group makes up less than 2% of your data, document why it was excluded from the audit calculations.
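The 2% check in the last item above is simple to express in code. This is a hypothetical sketch, not regulatory guidance, and the group names and counts are invented:

```python
# Hypothetical applicant counts per demographic category (invented data)
applicants = {"Group A": 500, "Group B": 490, "Group C": 10}
total = sum(applicants.values())

# Flag categories that fall under the 2% threshold. Under the DCWP rules,
# such a category may be excluded from the impact-ratio calculations,
# but the exclusion itself must be documented in the audit summary.
excluded = {g: n for g, n in applicants.items() if n / total < 0.02}

print(excluded)  # Group C is 10 of 1000 applicants = 1%, so it is flagged
```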

Make LL 144 painless

BiasBeacon acts as your independent auditor: drop in your model-score CSV and, minutes later, download a DCWP-ready PDF bias-audit report—complete with selection rates, impact ratios, unknown counts, audit date, distribution date, and an Auditor Independence Statement. Publish the file, prove compliance, move on.

Final Thoughts

Local Law 144 marks a turning point in how technology and employment law intersect. As AI-driven hiring tools become more common, cities and states are beginning to demand transparency and accountability from employers.

Organizations that take the lead on compliance — by auditing their tools, updating notices, and prioritizing fairness — won’t just avoid penalties. They’ll build trust with candidates and demonstrate a genuine commitment to ethical hiring practices in the age of AI.
