She didn’t get an apartment because of an AI-generated score and filed a lawsuit to help others avoid the same fate.

Three hundred and twenty-four. That was the score given to Mary Louis by an artificial intelligence-based tenant screening tool. The software, SafeRent, did not explain in its 11-page report how the score was calculated or how it weighed the various factors. It didn’t say what the result really meant. It simply displayed Louis’s number and determined that it was too low. In a box next to the result, the report read: “Score recommendation: DECLINE.”

Louis, who works as a security guard, had applied for an apartment in an eastern Massachusetts suburb. When she toured the unit, the management company told her she should have no problem getting her application accepted. Although she had a low credit score and some credit card debt, she had a stellar reference from her landlord of 17 years, who said she consistently paid her rent on time. She would also be using a voucher for low-income renters, guaranteeing that the management company would receive at least a portion of the monthly rent in payments from the government. Her son, also named on the voucher, had a high credit score, indicating he could serve as a backstop against late payments.

But in May 2021, more than two months after she applied for the apartment, the management company emailed Louis to inform her that a computer program had rejected her application. She needed a score of at least 443 for her application to be accepted. There was no further explanation or way to appeal the decision.

“Mary, we regret to inform you that the third-party service we use to screen all potential tenants has denied your rental,” the email said. “Unfortunately, the service’s SafeRent leasing score was lower than allowed under our leasing standards.”

A tenant sues

Louis had to rent a more expensive apartment instead. The management there did not score her algorithmically. But she discovered that her experience with SafeRent was not unique. She was part of a class of more than 400 Black and Hispanic renters in Massachusetts who use housing vouchers and said their rental applications were rejected because of their SafeRent score.

In 2022, they banded together to sue the company under the Fair Housing Act, alleging that SafeRent discriminated against them. Louis and the other named plaintiff, Monica Douglas, alleged that the company’s algorithm disproportionately ranked Black and Hispanic tenants who use housing vouchers lower than white applicants. They allege that the software inaccurately weighed irrelevant account information about whether they would be good tenants (credit scores, non-housing debts) but failed to take into account that they would be using a housing voucher. Studies have shown that Black and Hispanic rental applicants are more likely than white applicants to have lower credit scores and to use housing vouchers.

“It was a waste of time waiting to get a rejection,” Louis said. “I knew my credit was not good. But the AI doesn’t know my behavior: it knew I was late on my credit card, but it didn’t know that I always pay my rent.”

It’s been two years since the group first sued SafeRent, so long that Louis says she has moved on with her life and almost forgot about the lawsuit, even though she was one of only two named plaintiffs. But her actions could still protect other renters who use similar housing programs, known as Section 8 vouchers for their place in the US federal legal code, from losing housing because of an algorithmically determined score.

SafeRent has reached a settlement with Louis and Douglas. In addition to making a $2.3 million payment, the company agreed that for five years it would stop using a scoring system or making any recommendations about prospective tenants who use housing vouchers. Although SafeRent did not legally admit any wrongdoing, it is rare for a technology company to agree to changes to its core products as part of a settlement; the more common outcome of such agreements is a financial payment.

“While SafeRent continues to believe that SRS Scores comply with all applicable laws, litigation is time-consuming and costly,” company spokesperson Yazmín López said in a statement. “It became increasingly clear that defending the SRS Score in this case would divert time and resources that SafeRent can better use to fulfill its core mission of providing housing providers with the tools they need to screen applicants.”

Your new AI landlord

Tenant screening systems like SafeRent are often used as a way to “avoid interacting” directly with applicants and shift the blame for a denial to a computer system, said Todd Kaplan, one of the attorneys representing Louis and the group of plaintiffs who sued the company.

The property management company told Louis that the software alone had decided to reject her, but the SafeRent report indicated that it was the management company that set the threshold for how high a score an applicant needed for their application to be accepted.

Still, even to people involved in the application process, the workings of the algorithm are opaque. The property manager who showed Louis the apartment said he didn’t understand why Louis would have trouble renting the apartment.

“They’re inputting a lot of information and SafeRent is creating their own scoring system,” Kaplan said. “It makes it harder for people to predict how SafeRent will view them. Not just for tenants who apply, even landlords don’t know the ins and outs of the SafeRent score.”

As part of Louis’s settlement with SafeRent, which was approved Nov. 20, the company can no longer use a scoring system or recommend whether to accept or reject a tenant if they are using a housing voucher. If the company introduces a new scoring system, it is required to have it independently validated by a third-party fair housing organization.

“Eliminating the approve-or-deny determination really allows the tenant to say, ‘I’m a great tenant,’” Kaplan said. “This makes it a much more individualized determination.”

AI extends to fundamental parts of life

Nearly all of the 92 million people considered low-income in the United States have been exposed to AI decision-making in fundamental aspects of life such as employment, housing, medicine, education or government assistance, according to a new report on the harms of AI by attorney Kevin De Liban, who represented low-income people as part of the Legal Aid Society. The founder of a new AI justice organization called TechTonic Justice, De Liban first began researching these systems in 2016, when he was approached by patients with disabilities in Arkansas who had suddenly stopped receiving as many hours of state-funded home care because of automated decision-making that reduced human involvement. In one case, the state’s Medicaid dispensation relied on a program that determined that a patient had no problems with his foot because it had been amputated.

“This made me realize that we shouldn’t defer to [AI systems] as some kind of highly rational way of making decisions,” De Liban said. He said these systems rest on assumptions rooted in “junk statistical science” that produce what he calls “absurdities.”

In 2018, after De Liban sued the Arkansas Department of Human Services on behalf of these patients over the department’s decision-making process, the state legislature ruled that the agency could no longer automate the determination of patients’ home care allocations. De Liban’s was an early victory in the fight against the harms of algorithmic decision-making, although its use persists across the country in other areas, such as employment.

Few regulations stop the proliferation of AI despite its flaws

Laws limiting the use of AI, especially in making important decisions that can affect a person’s quality of life, are few, as are avenues for liability for people harmed by automated decisions.

A survey conducted by Consumer Reports, published in July, found that most Americans were “uncomfortable with the use of artificial intelligence and algorithmic technology for decision-making at important life moments related to housing, employment, and health care.” Respondents said they were concerned about not knowing what information AI systems used to evaluate them.

Unlike Louis’ case, people are often not notified when an algorithm is used to make a decision about their lives, making it difficult to appeal or challenge those decisions.

“The existing laws we have can be helpful, but they are limited in what they can offer you,” De Liban said. “Market forces don’t work when it comes to poor people. Basically, the entire incentive is to produce more bad technology, and there is no incentive for companies to produce good options for low-income people.”

Federal regulators under Joe Biden have made several attempts to catch up with the rapidly evolving artificial intelligence industry. The president issued an executive order that included a framework intended, in part, to address risks related to national security and discrimination in artificial intelligence systems. However, Donald Trump has promised to undo that work and cut regulations, including Biden’s executive order on AI.

That may make lawsuits like Louis’s a more important avenue for AI accountability than ever. The lawsuit has already drawn interest from the US Department of Justice and the Department of Housing and Urban Development, which handle discriminatory housing policies that affect protected classes.

“To the extent that this is a landmark case, it has the potential to provide a roadmap for how to look at these cases and encourage other challenges,” Kaplan said.

Still, holding these companies accountable in the absence of regulation will be difficult, De Liban said. Lawsuits take time and money, and companies can find a way to create workarounds or similar products for people who are not covered by class action lawsuits. “You can’t bring these types of cases every day,” he said.
