The US government and Facebook parent company Meta have reached a settlement to resolve a lawsuit accusing the company of facilitating housing discrimination by letting advertisers specify that ads not be shown to people belonging to certain protected groups, according to a press release from the Department of Justice (DOJ). You can read the full agreement below.
The government first filed a case against Meta in 2019 over algorithmic discrimination in housing, though allegations about the company's practices date back years before that. The company took some steps to address the issue, but they clearly weren't enough for the DOJ. The department says this was the first case to deal with algorithmic violations of the Fair Housing Act.
The settlement, which must be approved by a judge before it is final, requires Meta to stop using a discriminatory housing advertising algorithm and instead develop a system that "will address racial and other disparities caused by its use of personalization algorithms in its ad delivery system."
Meta says the new system will replace its Special Ad Audience targeting tool for housing ads, as well as for credit and employment opportunities. According to the DOJ, the tool and its algorithms let advertisers target people similar to a pre-selected group. When deciding whom to show ads to, the DOJ says Special Ad Audiences took into account things like a user's estimated race, national origin, and gender, meaning it could ultimately pick and choose who saw housing ads — a violation of the Fair Housing Act. Meta denies wrongdoing in the settlement, noting that the agreement contains no admission of guilt or finding of liability.
In a statement on Tuesday, Meta announced that it plans to tackle the problem with machine learning, building a system that "will ensure the age, gender and estimated race or ethnicity of a housing ad's overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad." In other words, the system must make sure the people who actually see an ad reflect the eligible, targeted audience for that ad. Meta will look at age, gender, and estimated race or ethnicity to measure how far the actual audience diverges from the intended one.
Under the settlement, the company must prove to the government that the system works as intended and build it into its platform by the end of December 2022.
The company has promised to share its progress as it builds the new system. Once the government approves it and Meta puts it in place, a third party will "examine and verify" on an ongoing basis that it actually ensures ads are displayed in a fair and equitable manner.
Meta must also pay a $115,054 fine. While that's basically nothing for a company raking in billions each month, the DOJ notes it's the maximum penalty allowed for a Fair Housing Act violation.