A lawsuit alleging that Facebook’s parent company violated federal housing law appears to be nearing its close, as the tech giant has agreed to change its advertising system under a proposed settlement with the U.S. Department of Justice (DOJ).
The DOJ announced on June 21 that it had reached a settlement agreement with Meta Platforms Inc., resolving allegations that the company engaged in discriminatory advertising in violation of the Fair Housing Act (FHA).
The settlement, which still needs federal court approval, calls for the company to stop using several ad targeting features that the DOJ claims discriminate against Facebook users based on their race, color, religion, sex, disability, familial status and national origin.
“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division in a statement.
The settlement comes after the Justice Department filed a lawsuit in the U.S. District Court for the Southern District of New York on June 21. The case takes issue with Facebook’s ad targeting tool known as “Lookalike Audience” or “Special Ad Audience.”
According to the complaint, the tool uses a machine-learning algorithm to find Facebook users who share similarities with groups of individuals selected by an advertiser using several targeting options provided by Facebook.
The DOJ alleges that Facebook allowed its algorithm to consider FHA-protected characteristics, including race, religion and sex, when finding Facebook users who “look like” the advertiser’s source audience and are eligible to receive housing ads.
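For illustration, the sketch below shows how a lookalike-style expansion can inherit a source audience’s demographic skew when a protected characteristic (or a proxy for one) is part of each user’s feature vector. Meta’s actual algorithm is not public; the feature encoding, the cosine-similarity ranking and all numbers here are assumptions made for the example.

```python
# Illustrative sketch only: a toy "lookalike" expansion using cosine
# similarity over user feature vectors. Meta's actual system is not
# public; the features and metric below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy user base: each row is [interest_a, interest_b, protected_attr].
# protected_attr stands in for an FHA-protected characteristic encoded
# (directly or via proxies) in the feature vector.
users = rng.random((1000, 3))
users[:, 2] = (users[:, 2] > 0.5).astype(float)  # binary protected attribute

# Advertiser-supplied source audience, skewed toward protected_attr == 1.
source_idx = np.where(users[:, 2] == 1.0)[0][:50]
source_centroid = users[source_idx].mean(axis=0)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Rank all users by similarity to the source centroid; take the top 100
# as the expanded "lookalike" audience eligible to receive the ad.
scores = np.array([cosine(u, source_centroid) for u in users])
lookalike = np.argsort(scores)[::-1][:100]

# Because protected_attr feeds the similarity score, the expanded
# audience inherits the source audience's demographic skew.
print("protected_attr=1 share, full user base:", users[:, 2].mean())
print("protected_attr=1 share, lookalike:", users[lookalike, 2].mean())
```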
If the proposed settlement is approved, Meta is expected to stop using the tool, which the DOJ claims relies on a discriminatory algorithm, for housing ads by December 31, 2022.
By that time, Meta will also need to develop a new system for housing ads to address and mitigate the disparities in targeted advertising based on race, ethnicity and sex.
Meta will also need to pay a fine of $115,054—the maximum penalty available under the Fair Housing Act.
In a blog post that followed the DOJ’s announcement, Meta officials wrote that the settlement was a product of “more than a year of collaboration” with the U.S. Department of Housing and Urban Development (HUD) “to develop a novel use of machine learning technology” that would make sure that the audience that ends up seeing a housing ad more closely reflects the eligible targeted audience for that ad.
“We’re making this change in part to address feedback we’ve heard from civil rights groups, policymakers and regulators about how our ad system delivers certain categories of personalized ads, especially when it comes to fairness,” read an excerpt from the post. “So while HUD raised concerns about personalized housing ads specifically, we also plan to use this method for ads related to employment and credit. Discrimination in housing, employment and credit is a deep-rooted problem with a long history in the U.S., and we are committed to broadening opportunities for marginalized communities in these spaces and others.”
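The post did not detail the new method, but one generic way to quantify whether the delivered audience “more closely reflects” the eligible audience is to compare their demographic mixes directly. The sketch below does this with total variation distance; the bucket names and the metric are illustrative assumptions, not Meta’s disclosed approach.

```python
# Illustrative sketch only: quantifying how far the audience that
# actually sees an ad drifts from the eligible audience. The metric
# (total variation distance) is an assumption, not Meta's method.
from collections import Counter

def demographic_shares(audience):
    """Fraction of the audience falling in each demographic bucket."""
    counts = Counter(audience)
    total = len(audience)
    return {group: n / total for group, n in counts.items()}

def total_variation(eligible, delivered):
    """0.0 means an identical demographic mix; 1.0 means fully disjoint."""
    p, q = demographic_shares(eligible), demographic_shares(delivered)
    groups = set(p) | set(q)
    return 0.5 * sum(abs(p.get(g, 0) - q.get(g, 0)) for g in groups)

# Toy example: the eligible audience is a 50/50 mix, but ad delivery
# skewed 80/20 toward one group.
eligible = ["group_a"] * 500 + ["group_b"] * 500
delivered = ["group_a"] * 400 + ["group_b"] * 100
print(f"disparity: {total_variation(eligible, delivered):.2f}")  # 0.30
```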