In today’s digital age, technology has revolutionized many sectors, including property management. However, like all powerful tools, it brings challenges that must be navigated. A growing concern in the property management domain is the potential pitfalls of screening software and its implications for fair housing. Let’s delve into a case that underscores this concern.
Case Name: Louis et al. v. SafeRent et al. (D. Mass.)
The Case at Hand: Louis v. SafeRent
Screening software, although designed to streamline and standardize the tenant application process, has landed some companies in hot water. One such case that has garnered attention is Louis v. SafeRent.
As is typical of these types of cases, the crux of Louis v. SafeRent revolves around algorithms. These mathematical formulas, although seemingly neutral, can inadvertently produce outcomes that disproportionately impact African-American and Hispanic applicants. In this particular case, the plaintiffs argue that SafeRent’s screening software unjustly caused their applications to be denied based on an overall screening score derived from an algorithm. The contention is that this algorithm takes into account factors like credit history and non-tenancy-related debts, which can disproportionately disadvantage certain demographic groups.
One of the plaintiffs’ primary arguments is that they were using housing choice vouchers, which, they contend, are an assurance of the ability to pay rent. Hence, the question arises: should factors like credit history even be considered in the screening process when vouchers are in play?
The Potential Discrimination Behind Neutral Algorithms
At face value, screening software, removing the possibility of human biases, seems like the perfect tool to promote fair housing. After all, it doesn’t “know” an applicant’s race, national origin, or other protected categories. But here’s the catch: it’s not the software itself that might be discriminatory but the data and criteria fed into it.
Importantly, this doesn’t necessarily imply that management companies are actively discriminating based on race or national origin. Instead, it can be the unintentional result of criteria that may have a disparate impact on certain groups of applicants.
The Liability Aspect
An intriguing aspect of such cases is the liability of the housing provider. If the algorithms are run by the screening software, can the housing provider be held responsible? Keep in mind that housing providers usually play a significant role in devising the criteria that get fed into the software. In essence, they remain pivotal participants in the screening process and, therefore, can be held accountable.
Is Screening Software Still Safe?
Considering the implications and the potential legal battles, is it wise for management companies to continue using screening software? The use of screening software is prevalent, and its benefits in standardizing the application process cannot be denied. However, caution needs to be exercised. The criteria applied by the software should be carefully thought out, relevant to tenancy, and tailored to avoid unintended biases. Moreover, giving applicants a chance to appeal decisions provides an avenue for addressing potential concerns.
Wrapping Up
The intersection of technology and fair housing is intricate. As we continue to rely more on algorithms and software, it’s imperative to ensure that these tools promote fairness and do not inadvertently perpetuate biases.