Facebook settles with DOJ over discriminatory housing ads

Facebook owner Meta agreed to overhaul the social network’s targeted advertising system under a sweeping settlement with the US Department of Justice, after the company was accused of allowing landlords to market housing ads in discriminatory ways.

The settlement, which stems from a 2019 Fair Housing Act lawsuit brought by the Trump administration, is the second in which the company has agreed to change its advertising systems to prevent discrimination. But Tuesday’s settlement goes further than the first, requiring Facebook to overhaul its powerful internal ad-targeting tool, known as Lookalike Audiences. Government officials said that by allowing advertisers to target housing-related ads by race, gender, religion or other sensitive characteristics, the product enabled housing discrimination.

Under the settlement, Facebook will build a new automated advertising system that the company says will help ensure housing-related ads are delivered to a fairer mix of the population. The agreement requires the social media giant to submit that system to a third party for review. Facebook, which last year renamed its parent company Meta, also agreed to pay a fine of $115,054, the maximum penalty available under the law.

“This settlement is historic, and marks the first time Meta has agreed to end one of its algorithmic targeting tools and change its delivery algorithms for housing ads in response to a civil rights lawsuit,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division.

Advertisers will still be able to target their ads to users in specific locations, though not by zip code alone, and to those with a limited set of interests, according to Facebook spokesman Joe Osborne.

Facebook’s vice president of civil rights, Roy Austin, said in a statement that the company will use machine learning technology to try to more equitably distribute who sees housing-related ads, regardless of how marketers target them, taking into account users’ age, gender and probable race.

“Discrimination in housing, employment and credit is a deep-rooted problem with a long history in the United States, and we are committed to expanding the opportunities for marginalized communities in these areas and others,” Austin said in a statement. “This type of work is unique in the advertising industry and represents a significant technological advancement in how machine learning is used to deliver personalized ads.”

Federal law prohibits housing discrimination based on race, religion, national origin, gender, disability or familial status.

The agreement follows a series of legal complaints from the Justice Department, state attorneys general and civil rights groups against Facebook, claiming that the company’s algorithm-based marketing tools, which give advertisers a unique ability to target ads to thin slices of the population, have discriminated against minorities and other vulnerable groups in the areas of housing, credit and employment.

In 2019, Facebook agreed to stop allowing advertisers to use gender, age and zip code, which often act as proxies for race, to market housing, credit and job ads to users. The change came after an investigation by Washington state’s attorney general and a ProPublica report found that Facebook allowed advertisers to use its micro-targeting tools to hide housing ads from African American users and other minorities. Facebook subsequently said it would no longer allow advertisers to use its “ethnic affinities” category for housing, credit and job ads.

But even after the company agreed to those settlements, researchers found that Facebook’s systems could continue to discriminate even when advertisers were barred from checking specific boxes for gender, race or age. In some cases, the software detected that people of a particular race or gender frequently clicked on a specific ad, and then began amplifying those biases by delivering ads to “look-alike audiences,” said Peter Romer-Friedman, a principal at the law firm Gupta Wessler PLLC.

The result could be that only men were shown a particular housing ad, even when the advertiser had not specifically tried to show the ad only to men, said Romer-Friedman, who has filed several civil rights cases against the company, including the 2018 settlement in which it agreed to limit ad-targeting categories.

Romer-Friedman said the settlement was a “great achievement” because it was the first time a platform was willing to make major changes to its algorithms in response to a civil rights lawsuit.

For years, Facebook has faced complaints from civil rights advocates and people of color, who argue that the company’s enforcement systems sometimes unfairly remove content in which people complain about discrimination. In 2020, the company underwent an independent civil rights audit, which found that its policies were a “tremendous setback” for civil rights.