The results of Harvard professor Latanya Sweeney’s new paper sound like the premise of a bad comedy routine: the online advertisements served alongside search results differ based on the perceived race of the name being searched. You see, the ads attached to Google searches of white-sounding names like Brad, Luke, and Katie were all like, “Do you need contact information?” But the ads that came back for names like Leroy, Kareem, and Keisha were all like, “Arrested?” Is there a problem with Google’s results, or are they just reflecting society?
The paper found that searches of names commonly associated with black people, as defined by a separate study, were 25% more likely to be accompanied by ads for sites offering criminal background checks than searches of white-associated names.
Google, not surprisingly, says it doesn’t do any sort of racial profiling in its results, and Sweeney, for what it’s worth, is hardly accusing them of that. In the paper, though, she suggests that Google searches expose a societal bias: advertisers select the keywords they want their ads to target, so a racist advertiser can make racist connections.
The problem is also shaped by the way Google’s AdSense algorithms work. As users respond to the ads served with a search, the weight those ads carry changes: if the search community keeps clicking an ad, it becomes more likely to appear in other, similar searches, as the sketch below illustrates.
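To make that feedback loop concrete, here is a minimal Python sketch of click-weighted ad ranking. Everything here is hypothetical (the `AdRanker` class, the keywords, and the numbers); Google’s actual ad-scoring system is proprietary and far more elaborate.

```python
# A minimal sketch of click-feedback ad ranking, assuming a toy model.
from collections import defaultdict

class AdRanker:
    def __init__(self):
        # Per-(keyword, ad) feedback counts; hypothetical internal state.
        self.clicks = defaultdict(int)
        self.impressions = defaultdict(int)

    def record(self, keyword: str, ad: str, clicked: bool) -> None:
        """Log one showing of `ad` on a search for `keyword`."""
        self.impressions[(keyword, ad)] += 1
        if clicked:
            self.clicks[(keyword, ad)] += 1

    def weight(self, keyword: str, ad: str) -> float:
        """Estimated click-through rate: ads users click gain weight."""
        shown = self.impressions[(keyword, ad)]
        return self.clicks[(keyword, ad)] / shown if shown else 0.0

    def rank(self, keyword: str, ads: list[str]) -> list[str]:
        """Order candidate ads by learned weight for this keyword."""
        return sorted(ads, key=lambda ad: self.weight(keyword, ad), reverse=True)

# If users keep clicking the "Arrested?" ad on searches for one name,
# its weight grows and it surfaces more often on similar searches.
ranker = AdRanker()
for _ in range(10):
    ranker.record("leroy", "Arrested?", clicked=True)
    ranker.record("leroy", "Contact info?", clicked=False)
print(ranker.rank("leroy", ["Contact info?", "Arrested?"]))
```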
Whatever the cause, Sweeney says there is discrimination happening in the way Google delivers its ads. She says there is only a one percent chance her findings are the result of chance, and she would like to see technology used to prevent this kind of biased ad delivery.
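For readers curious what “only a one percent chance” means in practice, here is a hedged sketch of the kind of significance test that backs such a claim, run on made-up counts (these numbers are illustrative, not Sweeney’s actual data).

```python
# A chi-squared test of whether arrest-themed ads appear independently
# of name group, on purely illustrative counts.
from scipy.stats import chi2_contingency  # requires scipy

# Hypothetical contingency table:
# rows = name group, cols = (arrest ad shown, arrest ad not shown)
table = [[291, 134],   # black-identifying names (made-up numbers)
         [308, 306]]   # white-identifying names (made-up numbers)

chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p_value:.4f}")
# A p-value below 0.01 would mean under a one percent probability of
# seeing a gap this large if ads were delivered independently of race.
```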
If the results are just a reflection of society, though, is it right for Google to skew them?
(via The BBC)