The Surprising Reason Why All Google Searches Aren’t Created Equal

By Tanner Howard Mar 13, 2018

Eight years ago, as Safiya Umoja Noble was entertaining her nieces, she had a horrible revelation. Noble, now a professor at the University of Southern California’s Annenberg School for Communication and Journalism, googled the phrase “Black girls” looking for useful ideas. She instead found a page full of pornography.

Disturbed that the world’s most popular search engine could create such a distorted view of Black women and girls, Noble spent the next several years tracing the consequences of algorithmic learning for marginalized people. The result is her new book, “Algorithms of Oppression: How Search Engines Reinforce Racism,” which contests the largely unchallenged assumption that Google treats every group equally. 

She argues that imbalances in the material world contribute to similar injustices online. For example, she notes that the neo-Nazi group Stormfront controlled the domain “martinlutherking.org” for years and was able to spread misinformation from near the top of Google’s search results.

Noble says that she had trouble getting people to care at first but believes that the 2016 presidential election was a wakeup call about the underside of commercially driven search, like when Google surfaced inaccurate election results as a top link.

“This phenomenon of people acting upon myths and disinformation circulating on these large digital media platforms is what’s ultimately important,” she said in a February phone interview. Here is an edited and condensed version of our conversation.

In “Algorithms of Oppression,” you contest the notion that “filter bubbles” are trapping people in individualized, identity-based versions of the web. You argue that Black women and other marginalized people actually can’t avoid the racial and gender violence of the material world when they go online, even when they don’t seek it out. How did that argument come together, particularly since mainstream discussion of filtering is relatively new?

One thing I try to do in the book is trace how advertising is the primary driver of search engines. While people generally think the websites that come up are the best results for a search, what’s actually happening is the alignment of optimized content with the demographics that the advertiser is speaking to.

The largest groups in the United States control the discourse, or the types of content that come up, about people who are in the minority. So quite often, you are seeing the interests of men who are active users of the web. In the case of content about girls and women, the patterns of that consumer group over-determine what you find, because they’re often looking for pornography.

Even in the case of a social media platform like Facebook, it’s the political and corporate advertising that has over-determined the types of content that show up in our newsfeeds. Money always plays a big role in the manipulation of content on the web, and that is the primary driver, much more than our own personal tastes.

The idea that “representation matters” has become popular. But somehow, that hasn’t gained traction in discussions around search engines like Google. Can you talk about why?

I’m trying to dislodge the [common] notion that search engines are somehow online public libraries, that they are trusted, curated public portals that lead us to the most credible information. Part of the reason why people perceive search engines this way is that when you look for information that is utilitarian in nature, such as “Where’s the closest post office?” they lead you to pretty accurate information. Those kinds of searches reinforce our trust that more complex ideas will be represented accurately.

This is where we get into dangerous territory because of the ways in which any marginalized community cannot trust that the kind of information that comes back about them will be true, or that it hasn’t been highly optimized towards some other purpose, like pornography.

So what happens if you’re in a minority group that doesn’t have the resources or the numbers to shape those searches? Those are the things I’m trying to point to, because teachers and parents are encouraging the seeking out of answers through Google. The phrase “Just google it” has become so powerful, and it’s acculturating people to think that Google has the answers. I think we have to ask what kind of answers it’s providing.

In the book, you also point out that people most trust the search results that come up first. Why is that dangerous?

We live in a culture where we rank things in all kinds of contexts. We think, “If it’s number one, it’s the best. If it’s not number one, it’s not as important.” In the context of the United States, where we are so deeply embedded in notions of what or who is first, that feeds into our common-sense understanding of search engines.

You give the disturbing example of Dylann Roof and the lack of context he received when he googled “Black on White crime.” That example spoke to how deeply the commercialization of the web is rooted in the racism and the violence of the rest of our world.

Dylann Roof was trying to make sense of the trial of George Zimmerman in the Trayvon Martin case and went to the web and typed in a racist red herring, “Black on White crime,” which is not even a phenomenon. But the search engine cannot even reframe the question: “Do you mean White-on-Black crime? Do you mean something else?” So he has no context for understanding the results he gets as White supremacist or fascist ideology.

It takes a nuanced understanding to make sense of a question like that, and I’m not sure that the computer scientists who are programming search engines have the level of expertise needed to make sense of that question at the programming level.

A telling observation you make in the book is that in the 1960s, when more women and people of color were nominally integrated into political decision-making, there was a simultaneous rise in the public’s trust in computers and the belief that technology was better than humans at making effective decisions. How did you come to that conclusion?

I study the history of computerization, and I was thinking about the way in which men in the U.S. dominated the field of information science and computing.

When women came to information science and librarianship, for example, it became a deeply feminized field in which women were relegated to its service dimensions, while computing and the more technical work of organizational database development were reserved for men.

It’s interesting that the moment of intervention to allow for more voices and democratic participation in all sectors of society, in the 1960s and particularly around the Civil Rights Act, is also the moment that we start to invest in the automation of decision-making systems, which we could also call computerization.

If we fast-forward to 2018, we are in a moment in which people are more likely to trust machines than they are human beings. If you want to check the spelling of a word, the web is an incredibly powerful tool for that. But we have such low levels of information literacy that people can’t discern fact from fiction, yet believe that if they go online they will get something better than what they could get at school or from some other expert source. We have to think about what it means to engage with decision-making systems that are designed by, and under the control of, White men who control the discourse, narrative, and output in these systems. That’s concerning when we live in a multiracial democracy, where women are 50 percent of the population.

Tanner Howard is a freelance journalist and housing justice organizer who has written for In These Times, Nylon and Jacobin, among other publications.