ProPublica Exposes Facebook's Algorithmic Redlining: Digital Ad Platform Enables Housing Discrimination at Scale

Timeline Event · Confirmed
Tags: facebook · housing-discrimination · fair-housing · algorithmic-redlining · digital-discrimination · platform-extraction
Financial Extraction · Media Capture & Control · Regulatory Capture
Actors: Facebook, ProPublica, HUD, National Fair Housing Alliance
2016-10-28

On October 28, 2016, ProPublica publishes an investigation demonstrating that Facebook's advertising platform allows housing advertisers to exclude users by race, a direct violation of the Fair Housing Act. Reporters purchase housing ads on Facebook and use the platform's "Ethnic Affinity" targeting category to exclude African American, Asian American, and Hispanic users from seeing them. Facebook's system accepts the discriminatory ad parameters without objection. The investigation reveals that digital platforms have rebuilt the infrastructure of housing discrimination using data and algorithms instead of redlining maps and loan officer discretion.

The "Ethnic Affinity" tool is not a glitch or an unintended consequence. Facebook built it as a feature, designed to help advertisers reach — or exclude — users based on the platform's inference of their ethnic background derived from browsing behavior, "liked" pages, location data, and social connections. The platform doesn't ask users their race; it infers it from behavioral data and then sells that inference to advertisers. This creates plausible deniability: Facebook can claim it's not offering racial targeting because "Ethnic Affinity" is technically a behavioral category, not a racial one. The distinction is legalistic; the effect is identical.
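The inference step can be sketched in miniature. Everything below — the page names, the segment labels, the scoring rule — is invented for illustration; Facebook's actual model is proprietary and far more complex. The structural point survives the simplification: a "behavioral" label assembled from racially correlated signals functions as a racial category.

```python
# Hypothetical sketch of behavioral "affinity" inference. All names and
# signals here are invented; this is not Facebook's actual system.

# Invented lookup: pages whose audiences are assumed to skew toward a group.
AFFINITY_SIGNALS = {
    "page:soul-food-recipes": "african_american_affinity",
    "page:telenovela-fans": "hispanic_affinity",
    "page:k-drama-club": "asian_american_affinity",
}

def infer_affinity(liked_pages):
    """Score each affinity segment by how many correlated pages a user liked."""
    scores = {}
    for page in liked_pages:
        segment = AFFINITY_SIGNALS.get(page)
        if segment:
            scores[segment] = scores.get(segment, 0) + 1
    # The platform never asks the user's race; it sells the top-scoring
    # behavioral inference to advertisers instead.
    return max(scores, key=scores.get) if scores else None

user_likes = ["page:soul-food-recipes", "page:local-news", "page:soul-food-recipes"]
print(infer_affinity(user_likes))  # a behavioral label standing in for race
```

The user is never asked about race, and no field named "race" exists anywhere in the pipeline — which is exactly the plausible deniability the passage describes.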

ProPublica's follow-up investigations in 2017 reveal the problem extends beyond the Ethnic Affinity tool. Facebook's broader targeting system allows advertisers to exclude users based on interests that correlate with race — "African American culture," "Hispanic culture" — or to target by zip code in ways that replicate geographic redlining. The platform's machine learning optimization compounds the problem: when an advertiser's housing ad performs better with certain demographic groups, Facebook's algorithm automatically shows it less to others, creating discriminatory distribution even without explicit exclusion.
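The delivery feedback loop those follow-up investigations describe can be simulated with a toy example. The group labels and click-through rates below are invented; the point is only that a delivery rule which greedily maximizes observed engagement concentrates a housing ad's impressions on one group, even though no exclusion parameter appears anywhere in the code.

```python
# Illustrative simulation (invented numbers): an optimizer that serves each
# impression to whichever group shows the higher observed click-through rate
# ends up showing the ad overwhelmingly to one group — discriminatory
# distribution with no explicit exclusion.
import random

random.seed(0)

# Historical engagement skew: the true rates differ across groups because
# the underlying data reflects segregated housing patterns.
TRUE_CTR = {"group_a": 0.08, "group_b": 0.02}

def run_delivery(impressions=5000):
    shown = {"group_a": 0, "group_b": 0}
    clicks = {"group_a": 1, "group_b": 1}  # smoothed observed clicks
    views = {"group_a": 2, "group_b": 2}   # smoothed observed views
    for _ in range(impressions):
        # Greedy choice: serve whichever group has the higher observed CTR.
        group = max(views, key=lambda g: clicks[g] / views[g])
        shown[group] += 1
        views[group] += 1
        if random.random() < TRUE_CTR[group]:
            clicks[group] += 1
    return shown

shown = run_delivery()
print(shown)  # group_a receives the large majority of impressions
```

The skew is an emergent property of the objective ("maximize engagement"), not of any targeting input — which is why HUD's 2019 charge could fault the delivery system even when advertisers requested nothing discriminatory.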

The National Fair Housing Alliance and other civil rights organizations file complaints with HUD. In March 2019, HUD charges Facebook with violating the Fair Housing Act, finding that the platform "encouraged, enabled, and caused housing discrimination through the company's advertising platform." The charge document details how Facebook's ad delivery system discriminates even when advertisers don't explicitly request it — the algorithm's optimization for "engagement" and "relevance" produces racially disparate outcomes baked into the system's design.

Facebook settles in 2022, agreeing to overhaul its ad targeting for housing, employment, and credit and to build a new system that prevents discriminatory delivery. But the settlement addresses one platform; the underlying problem is structural. Google, Instagram, Twitter, and every major advertising platform operate on similar targeting logic. Machine learning systems trained on historical data — data that reflects decades of housing segregation — reproduce and amplify the patterns embedded in that data. The algorithm doesn't need to "know" about race to discriminate; it simply needs to optimize based on data shaped by racial inequality.
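That last point — discrimination without any race variable — can be shown with a deliberately minimal sketch. The data is synthetic and the "model" is just a lookup of historical engagement rates by zip code, but it captures the mechanism: when a feature is a proxy shaped by segregation, a race-blind scoring rule still splits outcomes along racial lines.

```python
# Minimal sketch with synthetic data: a "race-blind" delivery score that
# uses only zip code reproduces historical redlining, because the zip-code
# feature already encodes it. All names and numbers are invented.

# Synthetic history: engagement rates shaped by decades of segregation,
# not by individual interest in housing ads.
HISTORICAL_ENGAGEMENT = {"zip_redlined": 0.2, "zip_greenlined": 0.8}

def delivery_score(user):
    """No race field anywhere — only the learned rate for the user's zip."""
    return HISTORICAL_ENGAGEMENT[user["zip"]]

users = [
    {"name": "A", "zip": "zip_redlined"},
    {"name": "B", "zip": "zip_greenlined"},
]
served = {u["name"]: delivery_score(u) >= 0.5 for u in users}
print(served)  # {'A': False, 'B': True} — disparate outcomes, no race feature
```

Removing the race column from a dataset does not remove race from the data; any feature correlated with segregated geography carries the same signal.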

Algorithmic redlining represents the latest iteration of an extraction machine that runs continuously from the HOLC maps (1933) through FHA discrimination (1934-1968), through subprime targeting (2000-2008), to digital ad platforms (2016-present). Each technological generation updates the mechanism — from hand-drawn maps to credit scores to machine learning — while the geographic and demographic targets remain remarkably consistent. The neighborhoods that HOLC colored red in 1935 are the neighborhoods that Facebook's algorithm deprioritizes for housing ads in 2016. The extraction machine modernizes its tools; it doesn't change its targets.

Sources

  1. Facebook Lets Advertisers Exclude Users by Race — ProPublica
  2. HUD Charges Facebook With Housing Discrimination Over Company's Advertising Platform — U.S. Department of Housing and Urban Development (March 2019)
  3. Big Data's Disparate Impact — Solon Barocas & Andrew Selbst, California Law Review