Facebook is again finding itself in hot water over its ads.
The social network allowed advertisers to buy ads specifically targeting “Jew haters” and people who were “interested in” other anti-Semitic topics, according to a new report from ProPublica.
The publication found that Facebook’s advertising portal contained a number of anti-Semitic categories that ad buyers could use to target their ads on Facebook. These categories, which have since been removed, included “Jew haters,” “How to burn Jews,” “History of ‘why jews ruin the world,’” and “Hitler did nothing wrong.”
These repugnant “categories” were apparently created algorithmically because a small number of Facebook users listed them on their profiles under “interests” or “fields of study.” Facebook’s advertising tools automatically generate ad categories based on these fields.
The report notes that when ProPublica’s reporters attempted to buy ads targeting these groups, they were forced to include several other “categories” because the number of people who had self-identified as interested in these topics was too small for a single ad buy.
Still, it’s disturbing that these categories existed at all. ProPublica says Facebook’s automatic system also suggested “Second Amendment” as an additional category when the ones it chose were too small, “presumably because its system had correlated gun enthusiasts with anti-Semites.”
In a statement, Rob Leathern, product management director at Facebook, said the company was working to create “guardrails” that would prevent this from happening in the future.
“We don’t allow hate speech on Facebook. Our community standards strictly prohibit attacking people based on their protected characteristics, including religion, and we prohibit advertisers from discriminating against people based on religion and other attributes. However, there are times where content is surfaced on our platform that violates our standards. In this case, we’ve removed the associated targeting fields in question. We know we have more work to do, so we’re also building new guardrails in our product and review processes to prevent other issues like this from happening in the future.”