Technological change can trigger immense and rapid shifts in how we do business. The rise of artificial intelligence and more advanced algorithms in recent years has triggered one of those shifts in hiring practices. Many employers want to do whatever it takes to find the best candidates as quickly as possible. Some companies have begun turning to automated employment decision tools (AEDTs) to enable such decision-making in a more streamlined environment.
Many of these AEDTs use artificial intelligence, while others rely on simpler algorithms or rules-based processes. In every case, the employer hands some control over to technology. Understandably, this has made job applicants and regulators anxious about the potential for discrimination, whether unintentional or otherwise. As a result, there is a growing body of legislation and guidance on using AEDTs in hiring.
What does your business need to know right now? A basic understanding of these tools and the current state of regulation is a valuable foundation. Let’s begin with a definition.
What Are Automated Employment Decision Tools?
Automated employment decision tools are software solutions designed to accelerate parts of the hiring and selection process. Most employers still make final decisions themselves. However, AEDTs aim to narrow vast applicant pools to only the applications best suited to the hiring criteria. Such steps can save HR time and money while ostensibly elevating the quality of each job candidate.
There are many kinds of AEDTs today, and only some of them rely exclusively on AI. Still, most fall within the scope of the regulations that govern these tools. Some common examples of tools that employers have used in recent years include:
- Facial recognition and analysis tools that evaluate a candidate during video interviews.
- Tools that read applications and resumes to identify specific skills or experience.
- Tools that calculate information from questionnaires and other user-submitted data to create candidate rankings and scores.
- AI tools that can identify resumes relevant to a specific prompt query in a large pool of applicants.
Because employer background checks already carry heavy compliance requirements, these tools remain focused primarily on other areas of the hiring process. However, their rise has drawn swift attention from regulators.
What Does Federal Law Say About These Tools?
There is currently no federal legislation specifically regulating AEDTs. In 2023, though, the Equal Employment Opportunity Commission (EEOC) released nonbinding guidance to help employers understand AEDTs. Specifically, the EEOC considered how the use of automated tools squares with Title VII of the Civil Rights Act, which prohibits employment discrimination.
According to the EEOC, employers should choose tools developed with bias mitigation in mind. Employers remain responsible for the decisions these tools inform, which means they can be liable if the tools discriminate. The EEOC encourages employers to ask AEDT developers whether they applied the commission's four-fifths rule when building the algorithm or training the AI. Under that rule, the selection rate for any group should be at least four-fifths (80%) of the rate for the group with the highest selection rate; a lower ratio may be treated as evidence of adverse impact.
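To make the four-fifths calculation concrete, here is a minimal sketch in Python. The function name and the group labels are illustrative, not part of any EEOC tooling; the arithmetic simply compares each group's selection rate against the highest-selected group's rate.

```python
def four_fifths_check(selected, applicants):
    """Flag any group whose selection rate falls below 4/5 (80%)
    of the rate for the group with the highest selection rate.

    selected / applicants: dicts mapping group name -> counts.
    """
    # Selection rate per group: number selected / number who applied.
    rates = {g: selected[g] / applicants[g] for g in applicants}
    highest = max(rates.values())
    return {
        g: {
            "rate": rate,
            "impact_ratio": rate / highest,
            "flagged": rate / highest < 0.8,  # below four-fifths threshold
        }
        for g, rate in rates.items()
    }

# Hypothetical example: Group A has 48 of 100 applicants selected,
# Group B has 30 of 100. B's impact ratio is 0.30 / 0.48 = 0.625,
# below 0.8, so B is flagged for potential adverse impact.
result = four_fifths_check({"A": 48, "B": 30}, {"A": 100, "B": 100})
```

Note that the four-fifths rule is a screening heuristic, not a legal safe harbor: a passing ratio does not immunize an employer, and a failing one does not automatically establish discrimination.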
At one time, Congressional legislators set their sights on broad regulations for AEDTs. The “No Robot Bosses Act of 2023” would have made it illegal for employers to rely solely on AEDTs to make hiring decisions. The bill would also have mandated anti-bias testing and human oversight of hiring decisions. However, it did not advance.
No other legislators have yet reintroduced a similar law at the federal level. That doesn’t mean it won’t happen in the future. Employers investing in automation should carefully monitor for news on new regulation proposals.
The Current Landscape of State Laws on AI Tools
More regulatory action has occurred at the state and local levels. Some of the earliest laws, which took effect in 2020, focus on using technology to analyze video interviews. In Illinois, lawmakers mandated disclosures about the use of AI in video interviews, and applicants must consent to the evaluation. Maryland passed a similar law in 2020: employers using facial recognition during interviews must obtain a consent waiver and inform applicants about the process.
The landmark law in this area comes from New York City, where legislation about AEDTs finally went into force in July 2023 after numerous delays. Employers can’t use AEDTs unless they can certify several factors, including:
- Their tool has had a professional anti-bias audit that’s no more than one year old.
- The results of the bias audit are available to the public.
- Job applicants have been given notice of the procedure.
- Applicants can request an alternative selection process that does not rely on an AEDT.
California has attempted to pass similar rules requiring anti-bias audits, but legislators have yet to succeed. Several other states, including Vermont, Massachusetts, New York, and New Jersey, have undertaken similar efforts.
None of these bills has yet made it out of committee to reach a floor vote. However, as with federal law, efforts are likely to continue, especially as these tools mature and become more widespread.
Developing Strategies for Avoiding Bias in Hiring Tools
How can employers interested in accelerating their hiring processes use AEDTs while minimizing risks? Even in the absence of state regulations, employers can still violate Title VII through the improper use of such tools. Knowing that, it's clear that organizations have a responsibility to choose vendors and service providers with the utmost care.
Here are a few tips for exploring AEDTs and staying on the right side of the law.
When choosing an AEDT vendor, scrutinize them carefully. Ask questions about anti-bias efforts, how training occurs, and whether they used the four-fifths rule during development. Look for any evidence of prior anti-bias audits, particularly if your area of operation requires such audits.
Be open with candidates. Disclose that you use AI or automated tools during the hiring process. This practice is required by law in places such as New York City. Even where it isn't required, employers may wish to use that law as a framework for informing candidates about their efforts to run a fair process.
Keep a human in the loop. Don't rely solely on the results of any AEDT. Set your selection criteria carefully to avoid discrimination and the unnecessary elimination of quality candidates. Review the results to confirm your tools are screening out only those who genuinely wouldn't be a good fit.
In the near and long term, AI-powered screening tools will likely become more entrenched in companies nationwide. Preparing for their eventual use and adoption today can save you time and money later. Now is the time to start assessing bias mitigation efforts for your business and its tools.
Preparing Your Business to Navigate Trends
Should your business adopt these solutions? It’s a question many employers may need to explore in the coming years. Some companies, especially those hiring at scale, may see AEDTs as essential for the future of work. Others may prefer to keep the human element as the central aspect of hiring. With so many different types of tools, some companies will likely settle on a blended approach.
One thing is sure: as automated employment decision tools spread and mature, more regulations will most likely follow. Congress may revive attempts at legislation, and state regulators may take a proactive stance towards regulation if these tools become entrenched. If time reveals more companies misusing AEDTs, legislation may follow. Employers should carefully consider the technology they adopt and prepare to meet future compliance requirements.
About the Author
Michael Klazema is the lead author and editor for Dallas-based backgroundchecks.com, with a focus on human resources and employment screening developments.