Technology

Best practices: undergoing an AI audit for your recruitment tech

HR Dive

June 8, 2023


Photo by Philipp Katzenberger on Unsplash

Given the rapid adoption of AI and the uncertainty surrounding it, it’s become increasingly necessary for employers to take control of their tech stack and monitor their use of automated employment decision tools (AEDTs). For many, this will take the form of a recruitment technology audit. Undergoing an audit can be a daunting experience for any organization, especially an HR team. Don’t fear: when approached correctly, it can be an invaluable learning experience that helps identify areas for improvement and ensures fairness in your organization’s practices. In this blog post, we’ll share best practices to help you decide where to start with an AI audit.

Best Practices

Legislation such as NYC’s Local Law 144 requires employers to audit their AEDT recruitment technology at least once a year to ensure these tools are not introducing bias. This is a new legal requirement for many employers, and there is little clear guidance on how to begin or which best practices to follow when complying with the law. To effectively audit their recruitment technology vendors, employers need to be able to identify what bias might look like within the technology and understand the right questions to ask.
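At its core, the bias audit contemplated by Local Law 144 is built around impact ratios: the selection (or scoring) rate for each demographic category divided by the rate for the most favored category. The sketch below walks through that arithmetic on hypothetical numbers so you know what to look for in a vendor’s audit report; the categories, counts, and threshold are illustrative assumptions, not values prescribed by the law.

```python
# Minimal sketch of the impact-ratio arithmetic behind a Local Law 144-style
# bias audit. All counts are hypothetical; the law itself requires the audit
# to be performed by an independent auditor, and real audits also cover
# race/ethnicity and intersectional categories.

selections = {
    # category: (candidates selected, total candidates) -- illustrative numbers
    "Female": (40, 300),
    "Male": (60, 320),
}

def impact_ratios(counts):
    """Selection rate per category divided by the highest selection rate."""
    rates = {cat: selected / total for cat, (selected, total) in counts.items()}
    top_rate = max(rates.values())
    return {cat: rate / top_rate for cat, rate in rates.items()}

for category, ratio in impact_ratios(selections).items():
    # The 0.80 cutoff is the familiar four-fifths rule of thumb, used here only
    # to flag a category for closer review; the law does not prescribe a threshold.
    status = "review" if ratio < 0.80 else "ok"
    print(f"{category:8} impact ratio = {ratio:.2f} ({status})")
```

Knowing how these ratios are computed makes it much easier to judge whether a vendor’s audit documentation actually answers the question.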

We have identified best practices to help guide employers as they navigate this uncharted territory:

  • First, map your talent acquisition journey and workflows: Create a visual representation of your talent acquisition process to identify where automated employment decision-making tools may be present. This will help pinpoint areas where unwanted bias could emerge, show how data flows at each step, and enable you to take proactive measures when issues arise. Additionally, identify the vendors and systems used at each stage to better assess the technologies involved (see the workflow-inventory sketch after this list).
  • Next, reach out to your vendors: Once your processes are mapped, it’s time to evaluate your existing or potential recruitment vendors. Questions to consider when evaluating your tech partners:
      • Have you conducted an audit to assess and mitigate biases in your technology? If so, can you provide documentation of the audit findings?
      • How has your organization taken steps to ensure bias reduction in your technology?
      • Are there any transparency statements or public announcements available to show your commitment to mitigating bias in technology?

If a vendor cannot provide answers to these questions, it could be a warning sign that their technology may not align with your organization’s commitment to reducing bias and maintaining compliance.

  • Then, give candidates a voice: Start building trust with your applicants by offering them a way to voice concerns during the recruitment process. This can be done through the application itself or an automated survey, ensuring you stay informed about potential issues as soon as they are flagged.
  • Always keep an expert in the loop: Maintain a human presence at all times to address concerns quickly and efficiently. AI is meant to enhance the human work experience, not replace it. Collaborate with your vendor to ensure a productive and compliant process.
  • Lastly, conduct a demographics survey: Gathering data on your organization’s demographics and the demographics of those who apply to your open jobs can help you accurately assess the impact of technology on your workforce through self-reported information.
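
To make the workflow-mapping step concrete, below is one minimal way to keep that inventory in a structured form; the stage names, vendors, and fields are hypothetical placeholders rather than a prescribed schema, but a map like this makes it easy to see which AEDTs still lack audit documentation and which vendors to contact first.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Stage:
    """One step in the talent acquisition workflow (illustrative schema only)."""
    name: str
    vendor: str                            # system or vendor used at this step
    uses_aedt: bool                        # does an automated decision tool run here?
    data_collected: str                    # candidate data flowing through this step
    last_bias_audit: Optional[str] = None  # date of the vendor's most recent audit, if any

# Hypothetical pipeline map; replace with your own stages and vendors.
pipeline = [
    Stage("Job advertising", "Programmatic ad platform", True, "job seeker profiles", "2023-01"),
    Stage("Application intake", "Applicant tracking system", False, "resumes, EEO self-identification"),
    Stage("Screening", "AI screening tool", True, "resumes, assessment scores"),
    Stage("Interview scheduling", "Calendar tool", False, "candidate availability"),
]

# Flag stages that need vendor follow-up: an AEDT with no documented bias audit.
for stage in pipeline:
    if stage.uses_aedt and stage.last_bias_audit is None:
        print(f"Request audit documentation: {stage.name} ({stage.vendor})")
```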

Summary

Ultimately, fairness and transparency should always be the top priority. While efficiency is essential in the workplace, it should not come at the expense of fairness, and an AI audit can be an excellent opportunity for organizations to ensure that their practices are fair, transparent, and compliant.

With that in mind, PandoLogic has proactively made efforts to audit our AI technology and review our AI models’ current and future state applications to ensure, among other things, that we are helping to mitigate bias at the top of the recruitment funnel. We believe we have a responsibility to ensure that our use of AI is transparent, compliant, and consent-based. Download our explainability statement here.

Read the full report here.

What we're reading

‘We’re all fighting the giant’: Gig workers around the world are finally organizing

by Peter Guest - rest of world

Gig workers are connecting across borders to challenge platforms’ power and policies

Got Zoom fatigue? Out-of-sync brainwaves could be another reason videoconferencing is such a drag

by Dr. Julie Boland - The Conversation

I was curious about why conversation felt more laborious and awkward over Zoom and other video-conferencing software.

How to Purchase an Applicant Tracking System

by Dave Zielinski - SHRM

Experts say the first step in seeking a new ATS should be to evaluate your existing recruiting processes.
