Press Release
Damian Williams, the United States Attorney for the Southern District of New York, along with Kristen Clarke, Assistant Attorney General for the Justice Department’s Civil Rights Division, announced today that the Justice Department has entered into a settlement agreement resolving allegations that Meta Platforms, Inc., formerly known as Facebook, Inc., engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The agreement would resolve a lawsuit filed today in the U.S. District Court for the Southern District of New York alleging that Meta’s housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status, and national origin. The proposed settlement is subject to review and approval by a district judge in the Southern District of New York.
U.S. Attorney Damian Williams said: “When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods. Because of this ground-breaking lawsuit, Meta will—for the first time—change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this Office will proceed with the litigation.”
Assistant Attorney General Kristen Clarke said: “As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner. This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit. The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”
Principal Deputy Assistant Secretary Demetria McCain of the Department of Housing and Urban Development (HUD) said: “It is not just housing providers who have a duty to abide by fair housing laws. Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable. This type of behavior hurts us all. HUD appreciates its continued partnership with the Department of Justice as they seek to uphold our country’s civil rights laws.”
Among other things, the complaint alleges that Meta uses algorithms in determining which Facebook users receive housing ads, and that those algorithms rely, in part, on characteristics protected under the FHA. This is the Justice Department’s first case challenging algorithmic bias under the FHA.
Under the settlement, Meta will stop using an advertising tool for housing ads (known as the “Special Ad Audience” tool) which, according to the complaint, relies on a discriminatory algorithm to find users who “look like” other users based on FHA-protected characteristics. Meta also will develop a new system over the next six months to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads. If the United States concludes that the new system adequately addresses the discriminatory delivery of housing ads, then Meta will implement the system, which will be subject to Department of Justice approval and court oversight. If the United States concludes that the new system is insufficient to address algorithmic discrimination in the delivery of housing ads, then the settlement agreement will be terminated.
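For illustration only: the concern with “lookalike” tools described above is that an audience chosen purely by feature similarity can skew along a protected characteristic even when that characteristic is never used as an input, because ordinary features often correlate with it. The minimal Python sketch below demonstrates this general mechanism with invented data. It is a hypothetical illustration, not Meta’s actual system and not code from the case; every name and number in it is an assumption made up for this example.

```python
# Hypothetical illustration of how a "lookalike"-style audience selection
# can skew along a protected characteristic that is never an input, when
# ordinary features correlate with it. All data and names are invented;
# this is NOT Meta's algorithm or any code from the case.
import random

random.seed(0)

def make_user(group):
    """Simulate a user whose innocuous features correlate with `group`.

    `group` stands in for a protected characteristic. Only the features
    (think: pages followed, device type) are visible to the selection
    step below; `group` itself is never used for selection.
    """
    base = 0.8 if group == "A" else 0.2  # correlated with group, not identical
    features = [base + random.gauss(0, 0.15) for _ in range(3)]
    return {"group": group, "features": features}

def similarity(u, v):
    """Negative squared distance: higher means more alike."""
    return -sum((a - b) ** 2 for a, b in zip(u, v))

# Seed audience: users the advertiser already reaches, mostly group A.
seed = [make_user("A") for _ in range(90)] + [make_user("B") for _ in range(10)]
seed_centroid = [sum(u["features"][i] for u in seed) / len(seed) for i in range(3)]

# Candidate pool: perfectly balanced, 50/50 across groups.
pool = [make_user("A") for _ in range(500)] + [make_user("B") for _ in range(500)]

# "Lookalike" step: keep the 200 candidates most similar to the seed.
pool.sort(key=lambda u: similarity(u["features"], seed_centroid), reverse=True)
audience = pool[:200]

share_a = sum(u["group"] == "A" for u in audience) / len(audience)
print("Group A share of balanced pool: 0.50")
print(f"Group A share of lookalike audience: {share_a:.2f}")  # well above 0.50
```

In this toy setup the selected audience ends up heavily weighted toward one group despite starting from a balanced pool, which is the general pattern the complaint attributes to similarity-based audience tools.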
This settlement marks the first time that Meta will be subject to court oversight for its ad targeting and delivery system.
The United States’ Lawsuit
The United States’ complaint challenges three key aspects of Meta’s ad targeting and delivery system. Specifically, the complaint alleges that:

- Meta enabled advertisers to target housing ads by relying on FHA-protected characteristics to decide which Facebook users would be eligible, and ineligible, to receive those ads;
- Meta created ad targeting tools, including the “Special Ad Audience” tool, that use machine-learning algorithms relying in part on FHA-protected characteristics to find Facebook users who “look like” an advertiser’s source audience; and
- Meta’s ad delivery system uses machine-learning algorithms that rely in part on FHA-protected characteristics to determine which subset of an advertiser’s targeted audience will actually receive a housing ad.
The complaint alleges that Meta has used these three aspects of its advertising system to target and deliver housing-related ads to some Facebook users while excluding other users based on FHA-protected characteristics. The complaint further alleges both disparate treatment and disparate impact discrimination. Specifically, the complaint alleges that Meta is liable for disparate treatment because it intentionally classifies users on the basis of FHA-protected characteristics and designs algorithms that rely on users’ FHA-protected characteristics. The complaint also alleges that Meta is liable for disparate impact discrimination because the operation of its algorithms affects Facebook users differently on the basis of their membership in protected classes.
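To make the disparate impact theory concrete, the hedged Python sketch below computes group-level ad delivery rates and compares each group to the best-served group, in the spirit of the “four-fifths” screen sometimes used as an initial test in discrimination analysis. All figures, group names, and the 0.8 threshold are illustrative assumptions; none come from the complaint or the settlement.

```python
# Hypothetical disparate impact screen: compare the rate at which each
# group's eligible users actually received a housing ad, relative to the
# best-served group. All figures are invented and come from no filing.

def delivery_rates(delivered_by_group, eligible_by_group):
    """Fraction of each group's eligible users who received the ad."""
    return {g: delivered_by_group[g] / eligible_by_group[g]
            for g in eligible_by_group}

def impact_ratios(rates):
    """Each group's delivery rate relative to the highest rate."""
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Invented example: equal eligible audiences, unequal delivery.
eligible = {"group_A": 10_000, "group_B": 10_000}
delivered = {"group_A": 3_000, "group_B": 1_500}

rates = delivery_rates(delivered, eligible)
for group, ratio in impact_ratios(rates).items():
    # The 0.8 ("four-fifths") threshold is a conventional screen borrowed
    # from employment-discrimination analysis, used here only as an example.
    flag = "potential disparity" if ratio < 0.8 else "ok"
    print(f"{group}: rate={rates[group]:.2f}, ratio={ratio:.2f} -> {flag}")
```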
Settlement Agreement
These are the key features of the parties’ settlement agreement:

- Meta will stop using the “Special Ad Audience” tool for housing ads;
- Meta will develop, within six months, a new system to address racial and other disparities caused by the personalization algorithms in its ad delivery system for housing ads;
- That new system will be subject to Department of Justice approval and court oversight; and
- If the United States concludes that the new system does not adequately address the discriminatory delivery of housing ads, the agreement will terminate and the United States may proceed with the litigation.
The lawsuit is based on an investigation and charge of discrimination by HUD, which found that all three aspects of Facebook’s ad targeting and delivery system delivered housing ads based on FHA-protected characteristics. During its investigation, HUD found that Facebook allowed housing advertisers to exclude users from receiving housing-related ads through targeting options that referenced FHA-protected characteristics, and that Facebook’s machine-learning algorithm excluded users from receiving housing-related ads even when advertisers sought to target a diverse group of Facebook users. On March 28, 2019, HUD issued a charge of discrimination at the conclusion of its investigation, and Facebook elected to have that charge heard in federal court, resulting in this lawsuit. Prior to filing this suit, this Office, consistent with its standard practice, sought to resolve these issues without litigation.
On March 29, 2019, the day after the HUD charge was issued, a judge in the Southern District of New York approved the settlement of a private lawsuit, National Fair Housing Alliance et al. v. Facebook, Inc., 18 Civ. 2689, that addressed some of the issues raised in the HUD charge. Although that settlement reduced the potentially discriminatory targeting options available to advertisers, overlapping with some of the issues raised in the complaint the Justice Department filed today, it did not resolve the other problem raised in the Department’s complaint: Facebook’s discriminatory delivery of housing ads through machine-learning algorithms. The U.S. Attorney’s Office for the Southern District of New York had filed a Statement of Interest in support of the National Fair Housing Alliance case on August 17, 2018, arguing that the Communications Decency Act does not shield Facebook from liability for the delivery of housing ads.
U.S. Attorney Damian Williams and Assistant Attorney General Clarke thanked the Department of Housing and Urban Development for its efforts in the investigation.
The Fair Housing Act prohibits discrimination in housing on the basis of race, color, religion, sex, familial status, national origin, and disability. More information about the Civil Rights Division and the civil rights laws it enforces is available at www.justice.gov/crt. More information about the U.S. Attorney’s Office for the Southern District of New York is available at www.justice.gov/usao-sdny. Individuals who believe they have been victims of housing discrimination may submit a report to the U.S. Attorney’s Office for the Southern District of New York online at https://www.justice.gov/usao-sdny/civil-rights or by telephone at (212) 637-0840; may submit a report online to the Department of Justice at www.civilrights.justice.gov; or may contact the Department of Housing and Urban Development at 1-800-669-9777 or through its website at www.hud.gov.
The case is being handled by the Office’s Civil Rights Unit in the Civil Division. Assistant U.S. Attorneys Ellen Blain, David J. Kennedy, Jacob Lillywhite, and Christine S. Poscablo filed the case.
Contact: Nicholas Biase, Victoria Bosah, (212) 637-2600