AI in the hiring process is already controversial, but the growing intersection of AI and tenant screening has received less attention. With companies and brands rushing to integrate AI tools into their workflows, many employers are investigating this technology as well. It is easy to see the potential problems with removing the human element from hiring, but what about housing?
The contentious debate over tenant background and credit screening rages on. It remains a balancing act between a landlord’s need to gauge a prospective tenant’s financial viability and safety and concerns about fairness, second chances, and discrimination. Recent policy changes and advocacy efforts aim to reform screening practices and reduce barriers for vulnerable populations.
It’s also essential to understand the current state of AI and its impact on tenancy: first, how it works in screening, and second, how AI tools can introduce unintended bias into the process.
Additionally, laws tend to differ widely between states. Effective September 2024, we discontinued our tenant screening services and related credit report access. Because this is not a core market for us, we cannot keep pace with its rapidly growing regulatory burdens.
What’s the Big Deal About AI?
Finding suitable, safe tenants who provide a steady revenue stream is a significant challenge for property owners. The effort is further complicated in crowded metropolitan areas with intense competition for every open housing opportunity. AI background checks and tools that can automatically analyze a tenant’s application could help managers speed up the process and place new tenants faster.
AI’s strength lies in its capacity to continuously learn and evolve by processing data. With natural language capabilities, it can be adapted to parse an application, order a credit report, or place a request for a background check. Some developers envision tools that analyze these reports directly and give property managers a yes or no suggestion. One day, it may even be possible for a voice and chat AI to contact references on behalf of managers.
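To make the idea concrete, here is a minimal sketch of what such an AI-assisted screening pipeline might look like. Every name, field, and scoring rule here is hypothetical, not a real product or vendor API; the point is that the tool only produces a suggestion, and a person still makes the final call.

```python
# Hypothetical sketch of an AI-assisted tenant screening pipeline.
# All classes, fields, and thresholds are illustrative, not a real product API.
from dataclasses import dataclass


@dataclass
class Application:
    applicant_name: str
    raw_text: str          # free-form application text an NLP step would parse
    credit_score: int      # pulled from an ordered credit report
    background_flags: int  # count of items returned by a background check


def ai_recommendation(app: Application) -> str:
    """Return a *suggestion* only; a human reviewer makes the final decision."""
    score = 0
    score += 1 if app.credit_score >= 650 else -1
    score -= app.background_flags
    # A fuller tool might also run NLP over app.raw_text to extract income,
    # rental history, and references before scoring.
    return "recommend" if score > 0 else "needs human review"


if __name__ == "__main__":
    sample = Application("J. Doe", "Employed 4 years, two prior rentals...", 680, 0)
    print(ai_recommendation(sample))  # -> "recommend" (still subject to human review)
```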
How Can an AI Be Biased?
AI isn’t truly artificial intelligence; it’s a system for making predictions and decisions based on prior data. How an AI performs is tightly tied to its training data, so the biases of its developers, and of the historical records it learns from, can make their way into the model. AI may also draw unfair conclusions or reach discriminatory outcomes simply because of gaps in the quality of its training.
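A toy, synthetic illustration of the mechanism: if the historical decisions used for training already disadvantage one group, a model that learns from those decisions will reproduce the gap, even when both groups are equally qualified. The data below is fabricated for demonstration and has nothing to do with any real screening product.

```python
# Toy demonstration (synthetic data): a model trained on biased historical
# decisions reproduces that bias, even though "group" says nothing about merit.
import random

random.seed(0)

# Fabricated history: identical qualification rates, but group "B" applicants
# were approved far less often in the past.
history = []
for _ in range(1000):
    group = random.choice(["A", "B"])
    qualified = random.random() < 0.7  # same qualification rate for both groups
    approved = qualified and (random.random() < (0.95 if group == "A" else 0.55))
    history.append((group, qualified, approved))


def train(records):
    """A naive 'model' that simply learns the approval frequency per group."""
    rates = {}
    for g in ("A", "B"):
        outcomes = [approved for grp, _, approved in records if grp == g]
        rates[g] = sum(outcomes) / len(outcomes)
    return rates


model = train(history)
print(model)  # e.g. {'A': ~0.66, 'B': ~0.38} -- the historical gap is baked in
```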
We’ve already seen real-world examples of this kind of bias in early AI chatbots that quickly began to espouse discriminatory beliefs when prompted. Similar problems could exist in the tools used to evaluate tenant background checks, leading to concerns that AI could become a serious barrier to fair housing and employment. Currently, numerous working groups across the government, including at the Equal Employment Opportunity Commission (EEOC) and the Federal Trade Commission (FTC), are formulating proposals for AI regulation.
Working Around AI Bias
For those interested in bringing AI into the process, the best practice is to move cautiously. Be aware of local restrictions on how you can conduct a real estate background check, as some jurisdictions have already begun restricting the use of automated tools. New York City, for example, won’t allow such tools to be used until they have undergone an independent bias audit.
For property owners and employers, it is vital to keep a human element in the process. Automating more of the workflow is smart and can save time and money, but the final decision about another human being should always rest with a human. This step provides the opportunity to identify flaws in AI reasoning, to think critically about an applicant’s suitability, and ultimately to reach an independent (if AI-supported) conclusion.
Looking to the Future
Ultimately, the long-term impact of AI is difficult to predict. Although the gears of government regulation turn slowly, they are moving toward what could be a new regulatory environment for these tools. However, that could still be years away.
For now, employers and property managers should approach these tools with caution and beware of unrealistic claims from developers. Though AI and tenant screening could go hand in hand, it is also essential to be mindful of the technology’s growing pains and to maintain a human element throughout the process.
About the Author
Michael Klazema is the lead author and editor for Dallas-based backgroundchecks.com, with a focus on human resource and employment screening developments.