
The World’s Google Searches Reveal Our Secret Horrible Attitude Toward Women

These ads for U.N. Women show what happens if you type things like "women need to" into Google. The autocomplete function will suggest ways to fill in the blank based on common search terms such as "know their place" and "shut up."

We all know that Google Autocomplete functions something like a collective id, or perhaps a drunken friend who shouts out embarrassing stories about everyone in the room at a party. A new ad campaign by Memac Ogilvy & Mather Dubai developed for U.N. Women, however, reveals a darker side to the world’s most popular search terms when they include the word "women."

"When we came across these searches, we were shocked by how negative they were and decided we had to do something with them," Christopher Hunt, Ogilvy & Mather's art director, said in a statement.

Hunt and his team used searches from March 3, 2013, to display some of the oldest and most destructive attitudes toward women that come up as common browsing queries. For example, type in "women need to," and the top suggestions still include "know their place" and "shut up." Type in "women shouldn’t," and Google Autocomplete offers completions that would deny women basic rights, like "work" and "vote."

A quick, unscientific check of the equivalent searches about men turns up very different Autocomplete suggestions. Type in "men need to," and you’ll get "feel needed," "grow up," or "ejaculate." Type in "men shouldn’t," and you might get "wear flip flops."

Google Autocomplete suggestions are generated algorithmically and function as a "reflection of the search activity of all web users and the content of web pages indexed by Google." Google’s human monitors won’t limit suggestions unless they qualify as hate speech, pornography, or violence. That hands-off policy is helpful, in a sense, when the mass-generated Autocomplete inadvertently shows that sexism is alive, real, and influencing any number of personal or institutional interactions.

Yet, not all of Google’s Autocomplete or ad suggestions serve to merely demonstrate inequality. Sometimes those results can perpetuate discrimination, researchers argue. For example, earlier this year, Harvard Data Privacy Lab director Professor Latanya Sweeney found that when she typed her own first name into Google, the search engine generated ads offering criminal background checks. When she typed in 2,000 other "racially associated names," she got similar results for black-identifying names, but not when she typed in white-identifying names. Ad results like these could pose a serious problem if a candidate with a black-identifying name applies for a job and the potential employer runs a Google search, Sweeney concluded. The employer’s decision could be tainted by ads suggesting that candidate may have served time in prison, regardless of whether or not that’s true.

Nonetheless, the fact that Autocomplete can inadvertently reveal societal-level misogyny makes it a useful litmus test—even when it dredges up things we’d rather not see.
