Feds And Meta Settle Suit Over Alleged Discriminatory Housing Ad Algorithms


The federal government and Meta Platforms, parent company of Facebook, have reached a settlement over allegations that the tech giant practiced discriminatory advertising on Facebook, the U.S. Department of Justice announced on Tuesday.

The agreement was announced along with a lawsuit filed by the federal government against Meta in the U.S. District Court for the Southern District of New York. That court needs to approve the agreement, which would resolve the suit, before it goes into effect.

The suit alleges that Meta targeted users with housing ads based on algorithms that relied partly on characteristics protected under the Fair Housing Act.

The suit also alleges that the tech company's special ad audience tool unlawfully permitted advertisers to target users based on protected traits. The tool allowed advertisers to create audiences with various commonalities; lawful examples would be advertisers' current customers, visitors to their websites, or people who like their Facebook posts.

The 1968 law, along with later amendments, bans discrimination in housing based on race, color, religion, sex, disability, familial status or national origin.

Under the terms of the settlement, Meta will stop using its current algorithms for housing ads. In addition, the company will be required to create a new system for housing ads by the end of 2022, which the government would have to approve.

Even if the government accepts the new system, Meta would further be obliged to submit to regular third-party reviews to ensure compliance with the FHA. The settlement would be void, and the suit revived, if the new ad system falls short by the end of the year.

“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” Kristen Clarke, assistant attorney general of the DOJ's Civil Rights Division, said in a statement.

Meta, which admitted no wrongdoing, said in a statement that it will strive "to make additional progress toward a more equitable distribution of ads through our ad delivery process."

The company also said that it has worked with the Department of Housing and Urban Development for more than a year to "develop a novel use of machine learning technology that will work to ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad."