Once again, Facebook’s insanely lucrative ad platform is under fire over the questionable ways advertisers can target ads. ProPublica has surfaced numerous examples: last year it was able to exclude certain ethnic groups, including African-Americans and Hispanic Americans, from seeing certain ads; earlier this year, groups such as “Jew haters” and “Nazi party” could be specifically targeted by ads. In the wake of each scandal, Facebook pledged to crack down and fix the problem.
Well, not much seems to have changed. ProPublica was once again able to exclude certain protected groups from seeing ads it was promoting. These included African-Americans, people interested in wheelchair ramps, Jews, and Spanish speakers. These, explains ProPublica, are all groups that are “protected under the federal Fair Housing Act.” Because the reporters were promoting housing ads, Facebook could be violating the law by allowing such discrimination to happen.
This is yet another example of the whack-a-mole nature of Facebook’s ad platform problems. Every time an example like this pops up, the company explains that it was never supposed to happen and that it is going to fix the system. “This was a failure in our enforcement and we’re disappointed that we fell short of our commitments,” Ami Vora, vice president of product management at Facebook, told ProPublica.
Still, given its track record, we can only imagine what the next ad platform revelation will be. You can read the full ProPublica article here.
Update: Here’s Vora’s full statement about ProPublica’s discovery:
This was a failure in our enforcement and we’re disappointed that we fell short of our commitments. Earlier this year, we added additional safeguards to protect against the abuse of our multicultural affinity tools to facilitate discrimination in housing, credit and employment. The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure. Our safeguards, including additional human reviewers and machine learning systems have successfully flagged millions of ads and their effectiveness has improved over time. Tens of thousands of advertisers have confirmed compliance with our tighter restrictions, including that they follow all applicable laws. We don’t want Facebook to be used for discrimination and will continue to strengthen our policies, hire more ad reviewers, and refine machine learning tools to help detect violations. Our systems continue to improve but we can do better. While we currently require compliance notifications of advertisers that seek to place ads for housing, employment, and credit opportunities, we will extend this requirement to ALL advertisers who choose to exclude some users from seeing their ads on Facebook to also confirm their compliance with our anti-discrimination policies – and the law.