Typing "Black girls," "Latina girls," or "Asian girls" into Google's ad system yields highly pornographic keyword suggestions, but searching for "White girls" does not, according to The Markup. It's yet another disappointing example of racial bias in the company's advertising algorithm.
Background — Google's Keyword Planner is an integral part of its massive and intimidating advertising ecosystem. It helps countless marketing teams and analysts around the world find words and terms that pair well with ads for their products, services, and the like. Conversely, the very same system helps marketers avoid terms that would likely harm their efforts to reach broader audiences.
It's a complex environment with constant tweaks and supposed improvements from the company. This particular report, however, shows that the racial bias in Google's advertising algorithm remains alive and well. As The Markup reported, combining an ethnicity with "boy" or "girl" yielded radically different terms for non-White groups than it did for "White boys" and "White girls."
The report shows a side-by-side comparison: a search for "Black girls" yields incredibly explicit, pornographic terms, while the same search for "White girls" returns no suggestions at all. The Markup described this stark and rather obvious difference as the company's system containing "a racial bias that equated people of color with objectified sexualization while exempting White people from any associations whatsoever."
What Google says — A spokesperson for the company admitted that the terms were abhorrent. Of note, the representative said:
The language that surfaced in the keyword planning tool is offensive and while we use filters to block these kinds of terms from appearing, it did not work as intended in this instance. We’ve removed these terms from the tool and are looking into how we stop this from happening again.
Do better, and fast — This isn't the first time Google has drawn criticism for allowing controversial terms to flourish in its tags and keywords systems. In 2012, UCLA professor Safiya Noble highlighted the same issue of "Black girls" surfacing pornographic results in the company's search engine. In 2015, Google Photos' image labeling system tagged photos of Black people as "gorillas." In 2017, the company came under fire alongside Twitter for allowing advertisers to use explicitly racist terms to reach massive audiences.
While critics say more editorial content relevant to these terms is needed to counterbalance such results, it's unmistakably clear that Google's Keyword Planner algorithm also needs significant improvement sooner rather than later. For a company that has been running the world's most powerful search engine for at least 20 years, it's hard to believe that it keeps stumbling on the race front.