Technology

New York City Passed a Flawed Effort to Regulate AI in Recruiting & Hiring

Martin Burns

December 15, 2021

This week, New York City became the first city in the U.S. to pass legislation aimed at mitigating the risks of discrimination associated with the use of automated employment decision tools. The bill - I.1894 - passed by the City Council in early November, would ban employers from using automated hiring tools unless a yearly bias audit shows they won't discriminate based on an applicant's race or gender. The bill is similar to legislation in Illinois and Maryland, but is much broader in scope - both of those states passed laws regulating only the use of video interview tools, in particular facial recognition. I.1894 would also require makers of these AI tools to disclose more about their opaque workings, and would give candidates the option of choosing an alternative process - such as review by a human - to evaluate their application.

But there's a fairly large loophole for vendors baked right into the bill.

The original version of I.1894 was introduced in February 2020. The Council's Committee on Technology held a hearing on the bill in November 2020, where a number of civil rights and public interest organizations raised concerns, particularly about the vagueness of the bill's audit requirements, the inadequacy of its notice provisions, and the absence of strong enforcement mechanisms. In the twelve months since, the bill's enforcement and reporting provisions have only been watered down.

In fact, when it comes to enforcement - in what may be a cynical move on the part of the Council - it's up to the vendor to conduct and report the audits demonstrating its algorithms aren't biased. The vendor supplies its own bias audit to the prospective client, and then merely has to offer to perform ongoing audits.

This is the equivalent of Enron performing its own audits - which, in effect, is essentially what happened. And we all know how that turned out.

Given the complexity and opacity of AI systems, it’s impossible to know what requiring a “bias audit” would mean in practice. As AI rapidly develops, it’s not even clear if audits would work for some types of software.
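To make that ambiguity concrete, here is a minimal sketch of one common reading of a bias audit: a disparate-impact check along the lines of the EEOC's four-fifths rule, which compares each group's selection rate to that of the highest-rated group. To be clear, this is an assumption for illustration only - I.1894 does not define the audit this precisely, and the group labels, outcome counts, and 0.8 threshold below are hypothetical.

```python
from collections import Counter

def selection_rates(candidates):
    """Share of candidates the automated tool passed, per group."""
    selected, total = Counter(), Counter()
    for group, passed in candidates:
        total[group] += 1
        if passed:
            selected[group] += 1
    return {g: selected[g] / total[g] for g in total}

def adverse_impact_ratios(rates):
    """Each group's selection rate relative to the highest-rate group."""
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Hypothetical screening outcomes: (group label, passed automated screen?)
outcomes = ([("A", True)] * 60 + [("A", False)] * 40
            + [("B", True)] * 35 + [("B", False)] * 65)

rates = selection_rates(outcomes)
for group, ratio in adverse_impact_ratios(rates).items():
    verdict = "OK" if ratio >= 0.8 else "potential adverse impact"
    print(f"group {group}: rate {rates[group]:.0%}, ratio {ratio:.2f} -> {verdict}")
```

Even a check this simple leaves open who picks the groups, the data, and the threshold - and under the bill as passed, the answer to all three is effectively the vendor. That is exactly what critics have flagged.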

As Wired noted:

  • Some civil rights groups and AI experts also oppose the bill—for different reasons. Albert Fox Cahn, founder of the Surveillance Technology Oversight Project, organized a letter from 12 groups including the NAACP and New York University’s AI Now Institute objecting to the proposed law. Cahn wants to regulate hiring tech, but he says the New York proposal could allow software that perpetuates discrimination to get rubber-stamped as having passed a fairness audit.

Others, including Julia Stoyanovich, director of the Center for Responsible AI at New York University, echo similar concerns:

  • Stoyanovich is concerned that the bill’s auditing requirement is not well defined. She still thinks it’s worth passing, in part because when she organized public meetings on hiring technology at Queens Public Library, many citizens were surprised to learn that automated tools were widely used. “The reason I’m in favor is that it will compel disclosure to people that they were evaluated in part by a machine as well as a human,” Stoyanovich says. “That will help get members of the public into the conversation.”

The idea of an audit is excellent - but it can't be the foxes reassuring the farmers that all is well. Venture-backed HR/TA tech software firms are under tremendous investor pressure to make their numbers. Avoiding even the appearance of book-cooking is why independent financial audits are required of any publicly traded company. Without that level of reassurance, it will be hard to tell which AI is truly trending away from bias, and which is simply being wrapped to appear that way.

New York City legislators will need to consider adding third-party guard dogs to patrol the fences, and sniff out any bad behavior.

Image caption: City decides to let foxes decide if henhouses are secure
