
Dr. Safiya Noble discusses Google's role in propagation of racism.

By Yael Marans


“Stop telling your students to Google things,” said Dr. Safiya Noble, a professor at UCLA and the University of Southern California and a leading expert on how search engines control the flow of information.

Noble, whose previous career in publicity and advertising informed her knowledge of Google’s transactional system, has helped past clients secure better representation in Google search results. She explained that Google manipulates its search results so that the corporations that pay the most money appear on the first page, gaining the most prominent representation. She went on to discuss revealing examples from her book, “Algorithms of Oppression: How Search Engines Reinforce Racism.”

According to Noble, a Google search for the phrase “black girls” yields a variety of pornography sites. In this way, she said, the pornography industry, known for its extreme wealth and discriminatory practices, receives prime representation.

“I was really concerned that 10-year-olds and 13-year-olds were going to find this [porn] as representation of themselves,” Noble said. “That really created a sense of urgency for me.”

When Noble first began research on the topic in 2009, she entered an existing conversation about the problems surrounding Google, but she thinks she was filling a void within that discourse.

“There were people who were writing about the politics and power systems embedded in different kinds of platforms, and there were people who were writing about Google, but there weren’t people who were centering around black women or vulnerable people at the epicenter of the questions they might ask,” Noble said.

“Of course that was leading them to look for different kinds of evidence, or it precluded their ability to see evidence [of racism] that was everywhere,” she continued.

According to Noble, this manipulation of the flow of information reinforces American systems of oppression. She showed Jim Crow-era cartoons of young black girls that still surface on Google, drawing a connection to the sexualization of black women today and demonstrating the link between historical tropes of oppression and contemporary bias.

“The only way the enslaved labor force can continue to exist is if it’s reproduced on this continent,” Noble said. “You have these kinds of stereotypes that emerge to help reproduce the economic and social power systems and keep them intact.”

According to Noble, when someone searches a popular white nationalist phrase such as “black on white crime” on Google, the search results offer multiple routes to white nationalist platforms and present no potential counterpoints such as places where the phrase appears in scholarly or activist materials.

White nationalists also co-opt phrases popular in academic circles, such as “Boasian anthropology,” to lead people to their sites when they might not be drawn to them otherwise, Noble explained.

Noble referenced the example of Dylann Roof, who murdered nine African Americans at a church in Charleston in June 2015. A blog post Roof had published shortly before the shooting revealed that he had been inspired by the website of the Council of Conservative Citizens, a white supremacist organization.

Google executives claim it is against their principles to manipulate their algorithms, which could allow them to control this phenomenon. Noble noted, however, that Google search results for the same words vary by country, showing that Google tailors results to different cultural audiences.

Colleagues and students often ask Noble why her research targets Google.

“Google is the monopoly leader,” Noble said. “You have to study the monopoly leader because everyone else is trying to do what they do.”

Noble was invited by associate professor Ruha Benjamin to be the keynote lecturer for the Year of Data conference held by the Center for Digital Humanities.

“I think she has a way of drawing in both people who are starting to think about these issues for the first time as well as provoking people who have been reflecting on it for a while,” Benjamin said. “I was riveted, and I hope everyone else in the room was, too.”

“I am almost a little bit ashamed to say that was mind-blowing for me,” said Ingvild Skarpeid, a psychology Ph.D. student visiting Princeton as a student research collaborator this year. “Because I know there is bias, but that it’s there on such an innocuous level like that tiny Google search is more jarring.”

The lecture was held on Thursday, Dec. 6, at 4:30 p.m. in East Pyne Hall. It was hosted by the University’s Center for Digital Humanities.
