Lecturer sheds light on algorithms of oppression built into search engines

Ansley Puckett, Reporter

Trusting the results of a Google search can feel like second nature, but these searches can sometimes yield biased results that may go unnoticed by users.

Safiya Umoja Noble, author of “Algorithms of Oppression: How Search Engines Reinforce Racism,” visited App State Monday night as part of the University Forum Lecture Series. 

The book discusses “racist and sexist algorithm bias in commercial search engines,” Noble said.

Noble is an associate professor at the University of California, Los Angeles in the Departments of African American Studies and Information Studies, and is also a visiting faculty member at the University of Southern California.

In her talk, Noble discussed ideas from her book and the ways in which algorithms and artificial intelligence can hold bias and produce racist search results.

Noble said technology is coded by humans who carry bias, and this can lead to inappropriate and demeaning search results about certain races.

Photo by Jackie Park: Safiya Umoja Noble gave a lecture on Monday night in which she shared messages from her book “Algorithms of Oppression: How Search Engines Reinforce Racism” and described how artificial intelligence can support discrimination.

“One of the things I felt about writing this book and this research was to highlight the many cases of what I think of as algorithmically driven data failures,” Noble said.

Noble said it is important to realize that human-generated technology can discriminate.

“I think we want to think about a framework for what algorithmic practices exist,” Noble said. “We know that these technologies are designed by humans, and humans program and code all of their biases right into the technologies that they’re working through.”

Noble said that searches often yield results undesirable to the searcher, but searchers tend to accept those results anyway.

Nia Marshall, a junior chemistry major, said she believes speakers like Noble are important to have on campus. 

“I think (the lecture) was very inspirational, especially for marginalized identities as myself and others,” Marshall said. “I definitely feel like it should be talked about more because it’s not talked about at all.”

Marshall said she hopes Noble’s lecture will open students’ minds to new issues on campus. 

“You have some simple-minded people and people who are not open to viewing things like this, so I feel like this should definitely be a talk, so we can broaden somebody’s perspective as much as possible, especially on this campus and with different identities and different cultures,” Marshall said.

Joyce Ogburn, former dean of libraries at App State, said lectures like Noble’s shed light on important issues in our society.

“Noble has touched on some critical points in our society today that we really need to think hard about what our future is with data and tracking and technology,” Ogburn said. “Not just from the human perspective, but the perspective on the environment and our culture and how we treat each other.”

Noble said that creating a dialogue around algorithmically driven data failures is something everyone should think about.

“One of the things that I think we should be thinking about is, of course, just making these things visible, being able to talk about them, and not taking them for granted,” Noble said.