Facebook won’t let advertisers illegally target users by race anymore

One of Facebook’s most controversial features was curtailed Friday, but not killed.

Facebook told USA Today that it would end “ethnic affinity” targeting for certain types of ads after meetings with the Congressional Black Caucus, the Congressional Hispanic Caucus, and advocacy groups including the ACLU and the National Fair Housing Alliance.

“We are going to turn off, actually prohibit, the use of ethnic affinity marketing for ads that we identify as offering housing, employment and credit,” Erin Egan, Facebook’s vice president of U.S. public policy, told the publication.

Facebook assigns an “ethnic affinity” (effectively, a race) to some users based on the posts they like and share and on their profile information. The affinities include Hispanic American, African American, and Asian American. As first reported by ProPublica, advertisers could target or exclude users based on those designations, which users could only see if they went looking in their ad profiles. The feature prompted a class-action lawsuit in which plaintiffs allege that the targeting violates the Civil Rights Act. Facebook countered that it doesn’t actually collect racial data on users and said it will fight the suit.

Blocking exclusion by ethnic affinity for job, housing, and credit ads is a necessary step, but it doesn’t mean racial targeting by proxy will end on the platform. Users will continue to be assigned an ethnic affinity, and advertisers will still be able to include or exclude users based on it for other kinds of ads. Nor is this limited to Facebook: other digital advertising networks also assign a probable race to users and allow advertisers to target them accordingly.

It’s less outrageous that white users may see ads for different movies or clothing than Asian or Hispanic users do, but the problem of inaccurate inferences remains. When Fusion asked users to screencap their ad preferences, Facebook turned out to have made dozens of inferences about things like religion and sexual orientation, and it often got them wrong: identifying white users as African American, misidentifying LGBT allies as queer themselves, and even implying that a Jewish journalist was “interested in” a militant terrorist organization.

Facebook’s policy change is welcome, but it doesn’t address the larger problem of incorrect inferences drawn from metadata collection. On the bright side, Facebook does let you edit your advertising profile, if you’re willing to go looking for it.